Nov 25 12:50:40 np0005535656 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 25 12:50:40 np0005535656 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 25 12:50:40 np0005535656 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 25 12:50:40 np0005535656 kernel: BIOS-provided physical RAM map:
Nov 25 12:50:40 np0005535656 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 25 12:50:40 np0005535656 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 25 12:50:40 np0005535656 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 25 12:50:40 np0005535656 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 25 12:50:40 np0005535656 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 25 12:50:40 np0005535656 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 25 12:50:40 np0005535656 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 25 12:50:40 np0005535656 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 25 12:50:40 np0005535656 kernel: NX (Execute Disable) protection: active
Nov 25 12:50:40 np0005535656 kernel: APIC: Static calls initialized
Nov 25 12:50:40 np0005535656 kernel: SMBIOS 2.8 present.
Nov 25 12:50:40 np0005535656 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 25 12:50:40 np0005535656 kernel: Hypervisor detected: KVM
Nov 25 12:50:40 np0005535656 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 25 12:50:40 np0005535656 kernel: kvm-clock: using sched offset of 4131756948 cycles
Nov 25 12:50:40 np0005535656 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 25 12:50:40 np0005535656 kernel: tsc: Detected 2799.998 MHz processor
Nov 25 12:50:40 np0005535656 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 25 12:50:40 np0005535656 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 25 12:50:40 np0005535656 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 25 12:50:40 np0005535656 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 25 12:50:40 np0005535656 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 25 12:50:40 np0005535656 kernel: Using GB pages for direct mapping
Nov 25 12:50:40 np0005535656 kernel: RAMDISK: [mem 0x2ed25000-0x3368afff]
Nov 25 12:50:40 np0005535656 kernel: ACPI: Early table checksum verification disabled
Nov 25 12:50:40 np0005535656 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 25 12:50:40 np0005535656 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 12:50:40 np0005535656 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 12:50:40 np0005535656 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 12:50:40 np0005535656 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 25 12:50:40 np0005535656 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 12:50:40 np0005535656 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 12:50:40 np0005535656 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 25 12:50:40 np0005535656 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 25 12:50:40 np0005535656 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 25 12:50:40 np0005535656 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 25 12:50:40 np0005535656 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 25 12:50:40 np0005535656 kernel: No NUMA configuration found
Nov 25 12:50:40 np0005535656 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 25 12:50:40 np0005535656 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 25 12:50:40 np0005535656 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 25 12:50:40 np0005535656 kernel: Zone ranges:
Nov 25 12:50:40 np0005535656 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 25 12:50:40 np0005535656 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 25 12:50:40 np0005535656 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 25 12:50:40 np0005535656 kernel:  Device   empty
Nov 25 12:50:40 np0005535656 kernel: Movable zone start for each node
Nov 25 12:50:40 np0005535656 kernel: Early memory node ranges
Nov 25 12:50:40 np0005535656 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 25 12:50:40 np0005535656 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 25 12:50:40 np0005535656 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 25 12:50:40 np0005535656 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 25 12:50:40 np0005535656 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 25 12:50:40 np0005535656 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 25 12:50:40 np0005535656 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 25 12:50:40 np0005535656 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 25 12:50:40 np0005535656 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 25 12:50:40 np0005535656 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 25 12:50:40 np0005535656 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 25 12:50:40 np0005535656 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 25 12:50:40 np0005535656 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 25 12:50:40 np0005535656 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 25 12:50:40 np0005535656 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 25 12:50:40 np0005535656 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 25 12:50:40 np0005535656 kernel: TSC deadline timer available
Nov 25 12:50:40 np0005535656 kernel: CPU topo: Max. logical packages:   8
Nov 25 12:50:40 np0005535656 kernel: CPU topo: Max. logical dies:       8
Nov 25 12:50:40 np0005535656 kernel: CPU topo: Max. dies per package:   1
Nov 25 12:50:40 np0005535656 kernel: CPU topo: Max. threads per core:   1
Nov 25 12:50:40 np0005535656 kernel: CPU topo: Num. cores per package:     1
Nov 25 12:50:40 np0005535656 kernel: CPU topo: Num. threads per package:   1
Nov 25 12:50:40 np0005535656 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 25 12:50:40 np0005535656 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 25 12:50:40 np0005535656 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 25 12:50:40 np0005535656 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 25 12:50:40 np0005535656 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 25 12:50:40 np0005535656 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 25 12:50:40 np0005535656 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 25 12:50:40 np0005535656 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 25 12:50:40 np0005535656 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 25 12:50:40 np0005535656 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 25 12:50:40 np0005535656 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 25 12:50:40 np0005535656 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 25 12:50:40 np0005535656 kernel: Booting paravirtualized kernel on KVM
Nov 25 12:50:40 np0005535656 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 25 12:50:40 np0005535656 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 25 12:50:40 np0005535656 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 25 12:50:40 np0005535656 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 25 12:50:40 np0005535656 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 25 12:50:40 np0005535656 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 25 12:50:40 np0005535656 kernel: random: crng init done
Nov 25 12:50:40 np0005535656 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: Fallback order for Node 0: 0 
Nov 25 12:50:40 np0005535656 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 25 12:50:40 np0005535656 kernel: Policy zone: Normal
Nov 25 12:50:40 np0005535656 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 25 12:50:40 np0005535656 kernel: software IO TLB: area num 8.
Nov 25 12:50:40 np0005535656 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 25 12:50:40 np0005535656 kernel: ftrace: allocating 49313 entries in 193 pages
Nov 25 12:50:40 np0005535656 kernel: ftrace: allocated 193 pages with 3 groups
Nov 25 12:50:40 np0005535656 kernel: Dynamic Preempt: voluntary
Nov 25 12:50:40 np0005535656 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 25 12:50:40 np0005535656 kernel: rcu: 	RCU event tracing is enabled.
Nov 25 12:50:40 np0005535656 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 25 12:50:40 np0005535656 kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 25 12:50:40 np0005535656 kernel: 	Rude variant of Tasks RCU enabled.
Nov 25 12:50:40 np0005535656 kernel: 	Tracing variant of Tasks RCU enabled.
Nov 25 12:50:40 np0005535656 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 25 12:50:40 np0005535656 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 25 12:50:40 np0005535656 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 25 12:50:40 np0005535656 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 25 12:50:40 np0005535656 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 25 12:50:40 np0005535656 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 25 12:50:40 np0005535656 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 25 12:50:40 np0005535656 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 25 12:50:40 np0005535656 kernel: Console: colour VGA+ 80x25
Nov 25 12:50:40 np0005535656 kernel: printk: console [ttyS0] enabled
Nov 25 12:50:40 np0005535656 kernel: ACPI: Core revision 20230331
Nov 25 12:50:40 np0005535656 kernel: APIC: Switch to symmetric I/O mode setup
Nov 25 12:50:40 np0005535656 kernel: x2apic enabled
Nov 25 12:50:40 np0005535656 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 25 12:50:40 np0005535656 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 25 12:50:40 np0005535656 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 25 12:50:40 np0005535656 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 25 12:50:40 np0005535656 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 25 12:50:40 np0005535656 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 25 12:50:40 np0005535656 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 25 12:50:40 np0005535656 kernel: Spectre V2 : Mitigation: Retpolines
Nov 25 12:50:40 np0005535656 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 25 12:50:40 np0005535656 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 25 12:50:40 np0005535656 kernel: RETBleed: Mitigation: untrained return thunk
Nov 25 12:50:40 np0005535656 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 25 12:50:40 np0005535656 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 25 12:50:40 np0005535656 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 25 12:50:40 np0005535656 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 25 12:50:40 np0005535656 kernel: x86/bugs: return thunk changed
Nov 25 12:50:40 np0005535656 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 25 12:50:40 np0005535656 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 25 12:50:40 np0005535656 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 25 12:50:40 np0005535656 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 25 12:50:40 np0005535656 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 25 12:50:40 np0005535656 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 25 12:50:40 np0005535656 kernel: Freeing SMP alternatives memory: 40K
Nov 25 12:50:40 np0005535656 kernel: pid_max: default: 32768 minimum: 301
Nov 25 12:50:40 np0005535656 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 25 12:50:40 np0005535656 kernel: landlock: Up and running.
Nov 25 12:50:40 np0005535656 kernel: Yama: becoming mindful.
Nov 25 12:50:40 np0005535656 kernel: SELinux:  Initializing.
Nov 25 12:50:40 np0005535656 kernel: LSM support for eBPF active
Nov 25 12:50:40 np0005535656 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 25 12:50:40 np0005535656 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 25 12:50:40 np0005535656 kernel: ... version:                0
Nov 25 12:50:40 np0005535656 kernel: ... bit width:              48
Nov 25 12:50:40 np0005535656 kernel: ... generic registers:      6
Nov 25 12:50:40 np0005535656 kernel: ... value mask:             0000ffffffffffff
Nov 25 12:50:40 np0005535656 kernel: ... max period:             00007fffffffffff
Nov 25 12:50:40 np0005535656 kernel: ... fixed-purpose events:   0
Nov 25 12:50:40 np0005535656 kernel: ... event mask:             000000000000003f
Nov 25 12:50:40 np0005535656 kernel: signal: max sigframe size: 1776
Nov 25 12:50:40 np0005535656 kernel: rcu: Hierarchical SRCU implementation.
Nov 25 12:50:40 np0005535656 kernel: rcu: 	Max phase no-delay instances is 400.
Nov 25 12:50:40 np0005535656 kernel: smp: Bringing up secondary CPUs ...
Nov 25 12:50:40 np0005535656 kernel: smpboot: x86: Booting SMP configuration:
Nov 25 12:50:40 np0005535656 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 25 12:50:40 np0005535656 kernel: smp: Brought up 1 node, 8 CPUs
Nov 25 12:50:40 np0005535656 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 25 12:50:40 np0005535656 kernel: node 0 deferred pages initialised in 9ms
Nov 25 12:50:40 np0005535656 kernel: Memory: 7776572K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 605560K reserved, 0K cma-reserved)
Nov 25 12:50:40 np0005535656 kernel: devtmpfs: initialized
Nov 25 12:50:40 np0005535656 kernel: x86/mm: Memory block size: 128MB
Nov 25 12:50:40 np0005535656 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 25 12:50:40 np0005535656 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: pinctrl core: initialized pinctrl subsystem
Nov 25 12:50:40 np0005535656 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 25 12:50:40 np0005535656 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 25 12:50:40 np0005535656 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 25 12:50:40 np0005535656 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 25 12:50:40 np0005535656 kernel: audit: initializing netlink subsys (disabled)
Nov 25 12:50:40 np0005535656 kernel: audit: type=2000 audit(1764093038.765:1): state=initialized audit_enabled=0 res=1
Nov 25 12:50:40 np0005535656 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 25 12:50:40 np0005535656 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 25 12:50:40 np0005535656 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 25 12:50:40 np0005535656 kernel: cpuidle: using governor menu
Nov 25 12:50:40 np0005535656 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 25 12:50:40 np0005535656 kernel: PCI: Using configuration type 1 for base access
Nov 25 12:50:40 np0005535656 kernel: PCI: Using configuration type 1 for extended access
Nov 25 12:50:40 np0005535656 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 25 12:50:40 np0005535656 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 25 12:50:40 np0005535656 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 25 12:50:40 np0005535656 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 25 12:50:40 np0005535656 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 25 12:50:40 np0005535656 kernel: Demotion targets for Node 0: null
Nov 25 12:50:40 np0005535656 kernel: cryptd: max_cpu_qlen set to 1000
Nov 25 12:50:40 np0005535656 kernel: ACPI: Added _OSI(Module Device)
Nov 25 12:50:40 np0005535656 kernel: ACPI: Added _OSI(Processor Device)
Nov 25 12:50:40 np0005535656 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 25 12:50:40 np0005535656 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 25 12:50:40 np0005535656 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 25 12:50:40 np0005535656 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 25 12:50:40 np0005535656 kernel: ACPI: Interpreter enabled
Nov 25 12:50:40 np0005535656 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 25 12:50:40 np0005535656 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 25 12:50:40 np0005535656 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 25 12:50:40 np0005535656 kernel: PCI: Using E820 reservations for host bridge windows
Nov 25 12:50:40 np0005535656 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 25 12:50:40 np0005535656 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 25 12:50:40 np0005535656 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [3] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [4] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [5] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [6] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [7] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [8] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [9] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [10] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [11] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [12] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [13] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [14] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [15] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [16] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [17] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [18] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [19] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [20] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [21] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [22] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [23] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [24] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [25] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [26] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [27] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [28] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [29] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [30] registered
Nov 25 12:50:40 np0005535656 kernel: acpiphp: Slot [31] registered
Nov 25 12:50:40 np0005535656 kernel: PCI host bridge to bus 0000:00
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 25 12:50:40 np0005535656 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 25 12:50:40 np0005535656 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 25 12:50:40 np0005535656 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 25 12:50:40 np0005535656 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 25 12:50:40 np0005535656 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 25 12:50:40 np0005535656 kernel: iommu: Default domain type: Translated
Nov 25 12:50:40 np0005535656 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 25 12:50:40 np0005535656 kernel: SCSI subsystem initialized
Nov 25 12:50:40 np0005535656 kernel: ACPI: bus type USB registered
Nov 25 12:50:40 np0005535656 kernel: usbcore: registered new interface driver usbfs
Nov 25 12:50:40 np0005535656 kernel: usbcore: registered new interface driver hub
Nov 25 12:50:40 np0005535656 kernel: usbcore: registered new device driver usb
Nov 25 12:50:40 np0005535656 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 25 12:50:40 np0005535656 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 25 12:50:40 np0005535656 kernel: PTP clock support registered
Nov 25 12:50:40 np0005535656 kernel: EDAC MC: Ver: 3.0.0
Nov 25 12:50:40 np0005535656 kernel: NetLabel: Initializing
Nov 25 12:50:40 np0005535656 kernel: NetLabel:  domain hash size = 128
Nov 25 12:50:40 np0005535656 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 25 12:50:40 np0005535656 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 25 12:50:40 np0005535656 kernel: PCI: Using ACPI for IRQ routing
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 25 12:50:40 np0005535656 kernel: vgaarb: loaded
Nov 25 12:50:40 np0005535656 kernel: clocksource: Switched to clocksource kvm-clock
Nov 25 12:50:40 np0005535656 kernel: VFS: Disk quotas dquot_6.6.0
Nov 25 12:50:40 np0005535656 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 25 12:50:40 np0005535656 kernel: pnp: PnP ACPI init
Nov 25 12:50:40 np0005535656 kernel: pnp: PnP ACPI: found 5 devices
Nov 25 12:50:40 np0005535656 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 25 12:50:40 np0005535656 kernel: NET: Registered PF_INET protocol family
Nov 25 12:50:40 np0005535656 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 25 12:50:40 np0005535656 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 25 12:50:40 np0005535656 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 25 12:50:40 np0005535656 kernel: NET: Registered PF_XDP protocol family
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 25 12:50:40 np0005535656 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 25 12:50:40 np0005535656 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 25 12:50:40 np0005535656 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 71385 usecs
Nov 25 12:50:40 np0005535656 kernel: PCI: CLS 0 bytes, default 64
Nov 25 12:50:40 np0005535656 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 25 12:50:40 np0005535656 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 25 12:50:40 np0005535656 kernel: ACPI: bus type thunderbolt registered
Nov 25 12:50:40 np0005535656 kernel: Trying to unpack rootfs image as initramfs...
Nov 25 12:50:40 np0005535656 kernel: Initialise system trusted keyrings
Nov 25 12:50:40 np0005535656 kernel: Key type blacklist registered
Nov 25 12:50:40 np0005535656 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 25 12:50:40 np0005535656 kernel: zbud: loaded
Nov 25 12:50:40 np0005535656 kernel: integrity: Platform Keyring initialized
Nov 25 12:50:40 np0005535656 kernel: integrity: Machine keyring initialized
Nov 25 12:50:40 np0005535656 kernel: Freeing initrd memory: 75160K
Nov 25 12:50:40 np0005535656 kernel: NET: Registered PF_ALG protocol family
Nov 25 12:50:40 np0005535656 kernel: xor: automatically using best checksumming function   avx       
Nov 25 12:50:40 np0005535656 kernel: Key type asymmetric registered
Nov 25 12:50:40 np0005535656 kernel: Asymmetric key parser 'x509' registered
Nov 25 12:50:40 np0005535656 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 25 12:50:40 np0005535656 kernel: io scheduler mq-deadline registered
Nov 25 12:50:40 np0005535656 kernel: io scheduler kyber registered
Nov 25 12:50:40 np0005535656 kernel: io scheduler bfq registered
Nov 25 12:50:40 np0005535656 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 25 12:50:40 np0005535656 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 25 12:50:40 np0005535656 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 25 12:50:40 np0005535656 kernel: ACPI: button: Power Button [PWRF]
Nov 25 12:50:40 np0005535656 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 25 12:50:40 np0005535656 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 25 12:50:40 np0005535656 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 25 12:50:40 np0005535656 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 25 12:50:40 np0005535656 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 25 12:50:40 np0005535656 kernel: Non-volatile memory driver v1.3
Nov 25 12:50:40 np0005535656 kernel: rdac: device handler registered
Nov 25 12:50:40 np0005535656 kernel: hp_sw: device handler registered
Nov 25 12:50:40 np0005535656 kernel: emc: device handler registered
Nov 25 12:50:40 np0005535656 kernel: alua: device handler registered
Nov 25 12:50:40 np0005535656 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 25 12:50:40 np0005535656 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 25 12:50:40 np0005535656 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 25 12:50:40 np0005535656 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 25 12:50:40 np0005535656 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 25 12:50:40 np0005535656 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 25 12:50:40 np0005535656 kernel: usb usb1: Product: UHCI Host Controller
Nov 25 12:50:40 np0005535656 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 25 12:50:40 np0005535656 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 25 12:50:40 np0005535656 kernel: hub 1-0:1.0: USB hub found
Nov 25 12:50:40 np0005535656 kernel: hub 1-0:1.0: 2 ports detected
Nov 25 12:50:40 np0005535656 kernel: usbcore: registered new interface driver usbserial_generic
Nov 25 12:50:40 np0005535656 kernel: usbserial: USB Serial support registered for generic
Nov 25 12:50:40 np0005535656 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 25 12:50:40 np0005535656 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 25 12:50:40 np0005535656 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 25 12:50:40 np0005535656 kernel: mousedev: PS/2 mouse device common for all mice
Nov 25 12:50:40 np0005535656 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 25 12:50:40 np0005535656 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 25 12:50:40 np0005535656 kernel: rtc_cmos 00:04: registered as rtc0
Nov 25 12:50:40 np0005535656 kernel: rtc_cmos 00:04: setting system clock to 2025-11-25T17:50:39 UTC (1764093039)
Nov 25 12:50:40 np0005535656 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 25 12:50:40 np0005535656 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 25 12:50:40 np0005535656 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 25 12:50:40 np0005535656 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 25 12:50:40 np0005535656 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 25 12:50:40 np0005535656 kernel: usbcore: registered new interface driver usbhid
Nov 25 12:50:40 np0005535656 kernel: usbhid: USB HID core driver
Nov 25 12:50:40 np0005535656 kernel: drop_monitor: Initializing network drop monitor service
Nov 25 12:50:40 np0005535656 kernel: Initializing XFRM netlink socket
Nov 25 12:50:40 np0005535656 kernel: NET: Registered PF_INET6 protocol family
Nov 25 12:50:40 np0005535656 kernel: Segment Routing with IPv6
Nov 25 12:50:40 np0005535656 kernel: NET: Registered PF_PACKET protocol family
Nov 25 12:50:40 np0005535656 kernel: mpls_gso: MPLS GSO support
Nov 25 12:50:40 np0005535656 kernel: IPI shorthand broadcast: enabled
Nov 25 12:50:40 np0005535656 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 25 12:50:40 np0005535656 kernel: AES CTR mode by8 optimization enabled
Nov 25 12:50:40 np0005535656 kernel: sched_clock: Marking stable (1117006152, 147247605)->(1366948676, -102694919)
Nov 25 12:50:40 np0005535656 kernel: registered taskstats version 1
Nov 25 12:50:40 np0005535656 kernel: Loading compiled-in X.509 certificates
Nov 25 12:50:40 np0005535656 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 25 12:50:40 np0005535656 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 25 12:50:40 np0005535656 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 25 12:50:40 np0005535656 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 25 12:50:40 np0005535656 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 25 12:50:40 np0005535656 kernel: Demotion targets for Node 0: null
Nov 25 12:50:40 np0005535656 kernel: page_owner is disabled
Nov 25 12:50:40 np0005535656 kernel: Key type .fscrypt registered
Nov 25 12:50:40 np0005535656 kernel: Key type fscrypt-provisioning registered
Nov 25 12:50:40 np0005535656 kernel: Key type big_key registered
Nov 25 12:50:40 np0005535656 kernel: Key type encrypted registered
Nov 25 12:50:40 np0005535656 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 25 12:50:40 np0005535656 kernel: Loading compiled-in module X.509 certificates
Nov 25 12:50:40 np0005535656 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 25 12:50:40 np0005535656 kernel: ima: Allocated hash algorithm: sha256
Nov 25 12:50:40 np0005535656 kernel: ima: No architecture policies found
Nov 25 12:50:40 np0005535656 kernel: evm: Initialising EVM extended attributes:
Nov 25 12:50:40 np0005535656 kernel: evm: security.selinux
Nov 25 12:50:40 np0005535656 kernel: evm: security.SMACK64 (disabled)
Nov 25 12:50:40 np0005535656 kernel: evm: security.SMACK64EXEC (disabled)
Nov 25 12:50:40 np0005535656 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 25 12:50:40 np0005535656 kernel: evm: security.SMACK64MMAP (disabled)
Nov 25 12:50:40 np0005535656 kernel: evm: security.apparmor (disabled)
Nov 25 12:50:40 np0005535656 kernel: evm: security.ima
Nov 25 12:50:40 np0005535656 kernel: evm: security.capability
Nov 25 12:50:40 np0005535656 kernel: evm: HMAC attrs: 0x1
Nov 25 12:50:40 np0005535656 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 25 12:50:40 np0005535656 kernel: Running certificate verification RSA selftest
Nov 25 12:50:40 np0005535656 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 25 12:50:40 np0005535656 kernel: Running certificate verification ECDSA selftest
Nov 25 12:50:40 np0005535656 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 25 12:50:40 np0005535656 kernel: clk: Disabling unused clocks
Nov 25 12:50:40 np0005535656 kernel: Freeing unused decrypted memory: 2028K
Nov 25 12:50:40 np0005535656 kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 25 12:50:40 np0005535656 kernel: Write protecting the kernel read-only data: 30720k
Nov 25 12:50:40 np0005535656 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 25 12:50:40 np0005535656 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 25 12:50:40 np0005535656 kernel: Run /init as init process
Nov 25 12:50:40 np0005535656 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 25 12:50:40 np0005535656 systemd: Detected virtualization kvm.
Nov 25 12:50:40 np0005535656 systemd: Detected architecture x86-64.
Nov 25 12:50:40 np0005535656 systemd: Running in initrd.
Nov 25 12:50:40 np0005535656 systemd: No hostname configured, using default hostname.
Nov 25 12:50:40 np0005535656 systemd: Hostname set to <localhost>.
Nov 25 12:50:40 np0005535656 systemd: Initializing machine ID from VM UUID.
Nov 25 12:50:40 np0005535656 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 25 12:50:40 np0005535656 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 25 12:50:40 np0005535656 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 25 12:50:40 np0005535656 kernel: usb 1-1: Manufacturer: QEMU
Nov 25 12:50:40 np0005535656 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 25 12:50:40 np0005535656 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 25 12:50:40 np0005535656 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 25 12:50:40 np0005535656 systemd: Queued start job for default target Initrd Default Target.
Nov 25 12:50:40 np0005535656 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 25 12:50:40 np0005535656 systemd: Reached target Local Encrypted Volumes.
Nov 25 12:50:40 np0005535656 systemd: Reached target Initrd /usr File System.
Nov 25 12:50:40 np0005535656 systemd: Reached target Local File Systems.
Nov 25 12:50:40 np0005535656 systemd: Reached target Path Units.
Nov 25 12:50:40 np0005535656 systemd: Reached target Slice Units.
Nov 25 12:50:40 np0005535656 systemd: Reached target Swaps.
Nov 25 12:50:40 np0005535656 systemd: Reached target Timer Units.
Nov 25 12:50:40 np0005535656 systemd: Listening on D-Bus System Message Bus Socket.
Nov 25 12:50:40 np0005535656 systemd: Listening on Journal Socket (/dev/log).
Nov 25 12:50:40 np0005535656 systemd: Listening on Journal Socket.
Nov 25 12:50:40 np0005535656 systemd: Listening on udev Control Socket.
Nov 25 12:50:40 np0005535656 systemd: Listening on udev Kernel Socket.
Nov 25 12:50:40 np0005535656 systemd: Reached target Socket Units.
Nov 25 12:50:40 np0005535656 systemd: Starting Create List of Static Device Nodes...
Nov 25 12:50:40 np0005535656 systemd: Starting Journal Service...
Nov 25 12:50:40 np0005535656 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 25 12:50:40 np0005535656 systemd: Starting Apply Kernel Variables...
Nov 25 12:50:40 np0005535656 systemd: Starting Create System Users...
Nov 25 12:50:40 np0005535656 systemd: Starting Setup Virtual Console...
Nov 25 12:50:40 np0005535656 systemd: Finished Create List of Static Device Nodes.
Nov 25 12:50:40 np0005535656 systemd: Finished Apply Kernel Variables.
Nov 25 12:50:40 np0005535656 systemd: Finished Create System Users.
Nov 25 12:50:40 np0005535656 systemd: Starting Create Static Device Nodes in /dev...
Nov 25 12:50:40 np0005535656 systemd-journald[307]: Journal started
Nov 25 12:50:40 np0005535656 systemd-journald[307]: Runtime Journal (/run/log/journal/f1b9744174f24cd5987c5e759ff70e72) is 8.0M, max 153.6M, 145.6M free.
Nov 25 12:50:40 np0005535656 systemd-sysusers[311]: Creating group 'users' with GID 100.
Nov 25 12:50:40 np0005535656 systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Nov 25 12:50:40 np0005535656 systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 25 12:50:40 np0005535656 systemd: Started Journal Service.
Nov 25 12:50:40 np0005535656 systemd[1]: Starting Create Volatile Files and Directories...
Nov 25 12:50:40 np0005535656 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 25 12:50:40 np0005535656 systemd[1]: Finished Create Volatile Files and Directories.
Nov 25 12:50:40 np0005535656 systemd[1]: Finished Setup Virtual Console.
Nov 25 12:50:40 np0005535656 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 25 12:50:40 np0005535656 systemd[1]: Starting dracut cmdline hook...
Nov 25 12:50:40 np0005535656 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Nov 25 12:50:40 np0005535656 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 25 12:50:40 np0005535656 systemd[1]: Finished dracut cmdline hook.
Nov 25 12:50:40 np0005535656 systemd[1]: Starting dracut pre-udev hook...
Nov 25 12:50:40 np0005535656 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 25 12:50:40 np0005535656 kernel: device-mapper: uevent: version 1.0.3
Nov 25 12:50:40 np0005535656 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 25 12:50:40 np0005535656 kernel: RPC: Registered named UNIX socket transport module.
Nov 25 12:50:40 np0005535656 kernel: RPC: Registered udp transport module.
Nov 25 12:50:40 np0005535656 kernel: RPC: Registered tcp transport module.
Nov 25 12:50:40 np0005535656 kernel: RPC: Registered tcp-with-tls transport module.
Nov 25 12:50:40 np0005535656 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 25 12:50:40 np0005535656 rpc.statd[444]: Version 2.5.4 starting
Nov 25 12:50:40 np0005535656 rpc.statd[444]: Initializing NSM state
Nov 25 12:50:40 np0005535656 rpc.idmapd[449]: Setting log level to 0
Nov 25 12:50:40 np0005535656 systemd[1]: Finished dracut pre-udev hook.
Nov 25 12:50:40 np0005535656 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 25 12:50:40 np0005535656 systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Nov 25 12:50:41 np0005535656 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 25 12:50:41 np0005535656 systemd[1]: Starting dracut pre-trigger hook...
Nov 25 12:50:41 np0005535656 systemd[1]: Finished dracut pre-trigger hook.
Nov 25 12:50:41 np0005535656 systemd[1]: Starting Coldplug All udev Devices...
Nov 25 12:50:41 np0005535656 systemd[1]: Created slice Slice /system/modprobe.
Nov 25 12:50:41 np0005535656 systemd[1]: Starting Load Kernel Module configfs...
Nov 25 12:50:41 np0005535656 systemd[1]: Finished Coldplug All udev Devices.
Nov 25 12:50:41 np0005535656 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 25 12:50:41 np0005535656 systemd[1]: Finished Load Kernel Module configfs.
Nov 25 12:50:41 np0005535656 systemd[1]: Mounting Kernel Configuration File System...
Nov 25 12:50:41 np0005535656 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 25 12:50:41 np0005535656 systemd[1]: Reached target Network.
Nov 25 12:50:41 np0005535656 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 25 12:50:41 np0005535656 systemd[1]: Starting dracut initqueue hook...
Nov 25 12:50:41 np0005535656 systemd[1]: Mounted Kernel Configuration File System.
Nov 25 12:50:41 np0005535656 systemd[1]: Reached target System Initialization.
Nov 25 12:50:41 np0005535656 systemd[1]: Reached target Basic System.
Nov 25 12:50:41 np0005535656 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 25 12:50:41 np0005535656 kernel: scsi host0: ata_piix
Nov 25 12:50:41 np0005535656 kernel: scsi host1: ata_piix
Nov 25 12:50:41 np0005535656 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 25 12:50:41 np0005535656 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 25 12:50:41 np0005535656 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 25 12:50:41 np0005535656 kernel: vda: vda1
Nov 25 12:50:41 np0005535656 kernel: ata1: found unknown device (class 0)
Nov 25 12:50:41 np0005535656 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 25 12:50:41 np0005535656 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 25 12:50:41 np0005535656 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 25 12:50:41 np0005535656 systemd-udevd[493]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 12:50:41 np0005535656 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 25 12:50:41 np0005535656 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 25 12:50:41 np0005535656 systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 25 12:50:41 np0005535656 systemd[1]: Reached target Initrd Root Device.
Nov 25 12:50:41 np0005535656 systemd[1]: Finished dracut initqueue hook.
Nov 25 12:50:41 np0005535656 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 25 12:50:41 np0005535656 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 25 12:50:41 np0005535656 systemd[1]: Reached target Remote File Systems.
Nov 25 12:50:41 np0005535656 systemd[1]: Starting dracut pre-mount hook...
Nov 25 12:50:41 np0005535656 systemd[1]: Finished dracut pre-mount hook.
Nov 25 12:50:41 np0005535656 systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 25 12:50:41 np0005535656 systemd-fsck[557]: /usr/sbin/fsck.xfs: XFS file system.
Nov 25 12:50:41 np0005535656 systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 25 12:50:41 np0005535656 systemd[1]: Mounting /sysroot...
Nov 25 12:50:42 np0005535656 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 25 12:50:42 np0005535656 kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 25 12:50:42 np0005535656 kernel: XFS (vda1): Ending clean mount
Nov 25 12:50:42 np0005535656 systemd[1]: Mounted /sysroot.
Nov 25 12:50:42 np0005535656 systemd[1]: Reached target Initrd Root File System.
Nov 25 12:50:42 np0005535656 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 25 12:50:42 np0005535656 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 25 12:50:42 np0005535656 systemd[1]: Reached target Initrd File Systems.
Nov 25 12:50:42 np0005535656 systemd[1]: Reached target Initrd Default Target.
Nov 25 12:50:42 np0005535656 systemd[1]: Starting dracut mount hook...
Nov 25 12:50:42 np0005535656 systemd[1]: Finished dracut mount hook.
Nov 25 12:50:42 np0005535656 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 25 12:50:42 np0005535656 rpc.idmapd[449]: exiting on signal 15
Nov 25 12:50:42 np0005535656 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 25 12:50:42 np0005535656 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Network.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Timer Units.
Nov 25 12:50:42 np0005535656 systemd[1]: dbus.socket: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 25 12:50:42 np0005535656 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Initrd Default Target.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Basic System.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Initrd Root Device.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Initrd /usr File System.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Path Units.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Remote File Systems.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Slice Units.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Socket Units.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target System Initialization.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Local File Systems.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Swaps.
Nov 25 12:50:42 np0005535656 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped dracut mount hook.
Nov 25 12:50:42 np0005535656 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped dracut pre-mount hook.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 25 12:50:42 np0005535656 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped dracut initqueue hook.
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped Apply Kernel Variables.
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped Coldplug All udev Devices.
Nov 25 12:50:42 np0005535656 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped dracut pre-trigger hook.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped Setup Virtual Console.
Nov 25 12:50:42 np0005535656 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Closed udev Control Socket.
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Closed udev Kernel Socket.
Nov 25 12:50:42 np0005535656 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped dracut pre-udev hook.
Nov 25 12:50:42 np0005535656 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped dracut cmdline hook.
Nov 25 12:50:42 np0005535656 systemd[1]: Starting Cleanup udev Database...
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 25 12:50:42 np0005535656 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 25 12:50:42 np0005535656 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Stopped Create System Users.
Nov 25 12:50:42 np0005535656 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 25 12:50:42 np0005535656 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 25 12:50:42 np0005535656 systemd[1]: Finished Cleanup udev Database.
Nov 25 12:50:42 np0005535656 systemd[1]: Reached target Switch Root.
Nov 25 12:50:42 np0005535656 systemd[1]: Starting Switch Root...
Nov 25 12:50:42 np0005535656 systemd[1]: Switching root.
Nov 25 12:50:42 np0005535656 systemd-journald[307]: Journal stopped
Nov 25 12:50:43 np0005535656 systemd-journald: Received SIGTERM from PID 1 (systemd).
Nov 25 12:50:43 np0005535656 kernel: audit: type=1404 audit(1764093042.791:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 25 12:50:43 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 12:50:43 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 12:50:43 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 12:50:43 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 12:50:43 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 12:50:43 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 12:50:43 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 12:50:43 np0005535656 kernel: audit: type=1403 audit(1764093042.961:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 25 12:50:43 np0005535656 systemd: Successfully loaded SELinux policy in 174.918ms.
Nov 25 12:50:43 np0005535656 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 34.017ms.
Nov 25 12:50:43 np0005535656 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 25 12:50:43 np0005535656 systemd: Detected virtualization kvm.
Nov 25 12:50:43 np0005535656 systemd: Detected architecture x86-64.
Nov 25 12:50:43 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 12:50:43 np0005535656 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 25 12:50:43 np0005535656 systemd: Stopped Switch Root.
Nov 25 12:50:43 np0005535656 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 25 12:50:43 np0005535656 systemd: Created slice Slice /system/getty.
Nov 25 12:50:43 np0005535656 systemd: Created slice Slice /system/serial-getty.
Nov 25 12:50:43 np0005535656 systemd: Created slice Slice /system/sshd-keygen.
Nov 25 12:50:43 np0005535656 systemd: Created slice User and Session Slice.
Nov 25 12:50:43 np0005535656 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 25 12:50:43 np0005535656 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 25 12:50:43 np0005535656 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 25 12:50:43 np0005535656 systemd: Reached target Local Encrypted Volumes.
Nov 25 12:50:43 np0005535656 systemd: Stopped target Switch Root.
Nov 25 12:50:43 np0005535656 systemd: Stopped target Initrd File Systems.
Nov 25 12:50:43 np0005535656 systemd: Stopped target Initrd Root File System.
Nov 25 12:50:43 np0005535656 systemd: Reached target Local Integrity Protected Volumes.
Nov 25 12:50:43 np0005535656 systemd: Reached target Path Units.
Nov 25 12:50:43 np0005535656 systemd: Reached target rpc_pipefs.target.
Nov 25 12:50:43 np0005535656 systemd: Reached target Slice Units.
Nov 25 12:50:43 np0005535656 systemd: Reached target Swaps.
Nov 25 12:50:43 np0005535656 systemd: Reached target Local Verity Protected Volumes.
Nov 25 12:50:43 np0005535656 systemd: Listening on RPCbind Server Activation Socket.
Nov 25 12:50:43 np0005535656 systemd: Reached target RPC Port Mapper.
Nov 25 12:50:43 np0005535656 systemd: Listening on Process Core Dump Socket.
Nov 25 12:50:43 np0005535656 systemd: Listening on initctl Compatibility Named Pipe.
Nov 25 12:50:43 np0005535656 systemd: Listening on udev Control Socket.
Nov 25 12:50:43 np0005535656 systemd: Listening on udev Kernel Socket.
Nov 25 12:50:43 np0005535656 systemd: Mounting Huge Pages File System...
Nov 25 12:50:43 np0005535656 systemd: Mounting POSIX Message Queue File System...
Nov 25 12:50:43 np0005535656 systemd: Mounting Kernel Debug File System...
Nov 25 12:50:43 np0005535656 systemd: Mounting Kernel Trace File System...
Nov 25 12:50:43 np0005535656 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 25 12:50:43 np0005535656 systemd: Starting Create List of Static Device Nodes...
Nov 25 12:50:43 np0005535656 systemd: Starting Load Kernel Module configfs...
Nov 25 12:50:43 np0005535656 systemd: Starting Load Kernel Module drm...
Nov 25 12:50:43 np0005535656 systemd: Starting Load Kernel Module efi_pstore...
Nov 25 12:50:43 np0005535656 systemd: Starting Load Kernel Module fuse...
Nov 25 12:50:43 np0005535656 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 25 12:50:43 np0005535656 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 25 12:50:43 np0005535656 systemd: Stopped File System Check on Root Device.
Nov 25 12:50:43 np0005535656 systemd: Stopped Journal Service.
Nov 25 12:50:43 np0005535656 kernel: fuse: init (API version 7.37)
Nov 25 12:50:43 np0005535656 systemd: Starting Journal Service...
Nov 25 12:50:43 np0005535656 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 25 12:50:43 np0005535656 systemd: Starting Generate network units from Kernel command line...
Nov 25 12:50:43 np0005535656 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 25 12:50:43 np0005535656 systemd: Starting Remount Root and Kernel File Systems...
Nov 25 12:50:43 np0005535656 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 25 12:50:43 np0005535656 systemd: Starting Apply Kernel Variables...
Nov 25 12:50:43 np0005535656 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 25 12:50:43 np0005535656 systemd-journald[679]: Journal started
Nov 25 12:50:43 np0005535656 systemd-journald[679]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 25 12:50:43 np0005535656 systemd[1]: Queued start job for default target Multi-User System.
Nov 25 12:50:43 np0005535656 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 25 12:50:43 np0005535656 systemd: Starting Coldplug All udev Devices...
Nov 25 12:50:43 np0005535656 systemd: Started Journal Service.
Nov 25 12:50:43 np0005535656 systemd[1]: Mounted Huge Pages File System.
Nov 25 12:50:43 np0005535656 kernel: ACPI: bus type drm_connector registered
Nov 25 12:50:43 np0005535656 systemd[1]: Mounted POSIX Message Queue File System.
Nov 25 12:50:43 np0005535656 systemd[1]: Mounted Kernel Debug File System.
Nov 25 12:50:43 np0005535656 systemd[1]: Mounted Kernel Trace File System.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Create List of Static Device Nodes.
Nov 25 12:50:43 np0005535656 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Load Kernel Module configfs.
Nov 25 12:50:43 np0005535656 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Load Kernel Module drm.
Nov 25 12:50:43 np0005535656 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 25 12:50:43 np0005535656 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Load Kernel Module fuse.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Generate network units from Kernel command line.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 25 12:50:43 np0005535656 systemd[1]: Mounting FUSE Control File System...
Nov 25 12:50:43 np0005535656 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 25 12:50:43 np0005535656 systemd[1]: Starting Rebuild Hardware Database...
Nov 25 12:50:43 np0005535656 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 25 12:50:43 np0005535656 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 25 12:50:43 np0005535656 systemd[1]: Starting Load/Save OS Random Seed...
Nov 25 12:50:43 np0005535656 systemd[1]: Starting Create System Users...
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Apply Kernel Variables.
Nov 25 12:50:43 np0005535656 systemd[1]: Mounted FUSE Control File System.
Nov 25 12:50:43 np0005535656 systemd-journald[679]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 25 12:50:43 np0005535656 systemd-journald[679]: Received client request to flush runtime journal.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Coldplug All udev Devices.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Load/Save OS Random Seed.
Nov 25 12:50:43 np0005535656 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Create System Users.
Nov 25 12:50:43 np0005535656 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 25 12:50:43 np0005535656 systemd[1]: Reached target Preparation for Local File Systems.
Nov 25 12:50:43 np0005535656 systemd[1]: Reached target Local File Systems.
Nov 25 12:50:43 np0005535656 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 25 12:50:43 np0005535656 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 25 12:50:43 np0005535656 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 25 12:50:43 np0005535656 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 25 12:50:43 np0005535656 systemd[1]: Starting Automatic Boot Loader Update...
Nov 25 12:50:43 np0005535656 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 25 12:50:43 np0005535656 systemd[1]: Starting Create Volatile Files and Directories...
Nov 25 12:50:43 np0005535656 bootctl[698]: Couldn't find EFI system partition, skipping.
Nov 25 12:50:43 np0005535656 systemd[1]: Finished Automatic Boot Loader Update.
Nov 25 12:50:44 np0005535656 systemd[1]: Finished Create Volatile Files and Directories.
Nov 25 12:50:44 np0005535656 systemd[1]: Starting Security Auditing Service...
Nov 25 12:50:44 np0005535656 systemd[1]: Starting RPC Bind...
Nov 25 12:50:44 np0005535656 systemd[1]: Starting Rebuild Journal Catalog...
Nov 25 12:50:44 np0005535656 auditd[704]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 25 12:50:44 np0005535656 auditd[704]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 25 12:50:44 np0005535656 systemd[1]: Started RPC Bind.
Nov 25 12:50:44 np0005535656 systemd[1]: Finished Rebuild Journal Catalog.
Nov 25 12:50:44 np0005535656 augenrules[709]: /sbin/augenrules: No change
Nov 25 12:50:44 np0005535656 augenrules[724]: No rules
Nov 25 12:50:44 np0005535656 augenrules[724]: enabled 1
Nov 25 12:50:44 np0005535656 augenrules[724]: failure 1
Nov 25 12:50:44 np0005535656 augenrules[724]: pid 704
Nov 25 12:50:44 np0005535656 augenrules[724]: rate_limit 0
Nov 25 12:50:44 np0005535656 augenrules[724]: backlog_limit 8192
Nov 25 12:50:44 np0005535656 augenrules[724]: lost 0
Nov 25 12:50:44 np0005535656 augenrules[724]: backlog 3
Nov 25 12:50:44 np0005535656 augenrules[724]: backlog_wait_time 60000
Nov 25 12:50:44 np0005535656 augenrules[724]: backlog_wait_time_actual 0
Nov 25 12:50:44 np0005535656 augenrules[724]: enabled 1
Nov 25 12:50:44 np0005535656 augenrules[724]: failure 1
Nov 25 12:50:44 np0005535656 augenrules[724]: pid 704
Nov 25 12:50:44 np0005535656 augenrules[724]: rate_limit 0
Nov 25 12:50:44 np0005535656 augenrules[724]: backlog_limit 8192
Nov 25 12:50:44 np0005535656 augenrules[724]: lost 0
Nov 25 12:50:44 np0005535656 augenrules[724]: backlog 0
Nov 25 12:50:44 np0005535656 augenrules[724]: backlog_wait_time 60000
Nov 25 12:50:44 np0005535656 augenrules[724]: backlog_wait_time_actual 0
Nov 25 12:50:44 np0005535656 systemd[1]: Started Security Auditing Service.
Nov 25 12:50:44 np0005535656 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 25 12:50:44 np0005535656 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 25 12:50:44 np0005535656 systemd[1]: Finished Rebuild Hardware Database.
Nov 25 12:50:44 np0005535656 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 25 12:50:44 np0005535656 systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Nov 25 12:50:44 np0005535656 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 25 12:50:44 np0005535656 systemd[1]: Starting Update is Completed...
Nov 25 12:50:44 np0005535656 systemd[1]: Finished Update is Completed.
Nov 25 12:50:44 np0005535656 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 25 12:50:44 np0005535656 systemd[1]: Reached target System Initialization.
Nov 25 12:50:44 np0005535656 systemd[1]: Started dnf makecache --timer.
Nov 25 12:50:44 np0005535656 systemd[1]: Started Daily rotation of log files.
Nov 25 12:50:44 np0005535656 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 25 12:50:44 np0005535656 systemd[1]: Reached target Timer Units.
Nov 25 12:50:44 np0005535656 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 25 12:50:44 np0005535656 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 25 12:50:44 np0005535656 systemd[1]: Reached target Socket Units.
Nov 25 12:50:44 np0005535656 systemd[1]: Starting D-Bus System Message Bus...
Nov 25 12:50:44 np0005535656 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 25 12:50:44 np0005535656 systemd[1]: Starting Load Kernel Module configfs...
Nov 25 12:50:44 np0005535656 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 25 12:50:44 np0005535656 systemd[1]: Finished Load Kernel Module configfs.
Nov 25 12:50:44 np0005535656 systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 12:50:44 np0005535656 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 25 12:50:44 np0005535656 systemd[1]: Started D-Bus System Message Bus.
Nov 25 12:50:44 np0005535656 systemd[1]: Reached target Basic System.
Nov 25 12:50:44 np0005535656 dbus-broker-lau[766]: Ready
Nov 25 12:50:44 np0005535656 systemd[1]: Starting NTP client/server...
Nov 25 12:50:44 np0005535656 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 25 12:50:44 np0005535656 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 25 12:50:44 np0005535656 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 25 12:50:44 np0005535656 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 25 12:50:44 np0005535656 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 25 12:50:44 np0005535656 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 25 12:50:44 np0005535656 systemd[1]: Starting IPv4 firewall with iptables...
Nov 25 12:50:44 np0005535656 systemd[1]: Started irqbalance daemon.
Nov 25 12:50:44 np0005535656 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 25 12:50:44 np0005535656 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 12:50:44 np0005535656 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 12:50:44 np0005535656 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 12:50:44 np0005535656 systemd[1]: Reached target sshd-keygen.target.
Nov 25 12:50:44 np0005535656 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 25 12:50:44 np0005535656 systemd[1]: Reached target User and Group Name Lookups.
Nov 25 12:50:44 np0005535656 systemd[1]: Starting User Login Management...
Nov 25 12:50:44 np0005535656 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 25 12:50:44 np0005535656 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 25 12:50:44 np0005535656 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 25 12:50:44 np0005535656 kernel: Console: switching to colour dummy device 80x25
Nov 25 12:50:44 np0005535656 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 25 12:50:44 np0005535656 kernel: [drm] features: -context_init
Nov 25 12:50:44 np0005535656 chronyd[802]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 25 12:50:44 np0005535656 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 25 12:50:44 np0005535656 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 25 12:50:44 np0005535656 chronyd[802]: Loaded 0 symmetric keys
Nov 25 12:50:44 np0005535656 chronyd[802]: Using right/UTC timezone to obtain leap second data
Nov 25 12:50:44 np0005535656 chronyd[802]: Loaded seccomp filter (level 2)
Nov 25 12:50:44 np0005535656 kernel: [drm] number of scanouts: 1
Nov 25 12:50:44 np0005535656 kernel: [drm] number of cap sets: 0
Nov 25 12:50:44 np0005535656 systemd[1]: Started NTP client/server.
Nov 25 12:50:44 np0005535656 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 25 12:50:44 np0005535656 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 25 12:50:44 np0005535656 kernel: Console: switching to colour frame buffer device 128x48
Nov 25 12:50:44 np0005535656 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 25 12:50:44 np0005535656 systemd-logind[788]: New seat seat0.
Nov 25 12:50:44 np0005535656 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 25 12:50:44 np0005535656 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 25 12:50:44 np0005535656 systemd[1]: Started User Login Management.
Nov 25 12:50:44 np0005535656 kernel: kvm_amd: TSC scaling supported
Nov 25 12:50:44 np0005535656 kernel: kvm_amd: Nested Virtualization enabled
Nov 25 12:50:44 np0005535656 kernel: kvm_amd: Nested Paging enabled
Nov 25 12:50:44 np0005535656 kernel: kvm_amd: LBR virtualization supported
Nov 25 12:50:44 np0005535656 iptables.init[779]: iptables: Applying firewall rules: [  OK  ]
Nov 25 12:50:44 np0005535656 systemd[1]: Finished IPv4 firewall with iptables.
Nov 25 12:50:45 np0005535656 cloud-init[841]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 25 Nov 2025 17:50:45 +0000. Up 6.88 seconds.
Nov 25 12:50:45 np0005535656 systemd[1]: run-cloud\x2dinit-tmp-tmpavcp5tkt.mount: Deactivated successfully.
Nov 25 12:50:45 np0005535656 systemd[1]: Starting Hostname Service...
Nov 25 12:50:45 np0005535656 systemd[1]: Started Hostname Service.
Nov 25 12:50:45 np0005535656 systemd-hostnamed[855]: Hostname set to <np0005535656.novalocal> (static)
Nov 25 12:50:45 np0005535656 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 25 12:50:45 np0005535656 systemd[1]: Reached target Preparation for Network.
Nov 25 12:50:45 np0005535656 systemd[1]: Starting Network Manager...
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9256] NetworkManager (version 1.54.1-1.el9) is starting... (boot:95ac565f-f1e9-49d2-ac3b-e18fc9a1498e)
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9262] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9415] manager[0x55e83631d080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9483] hostname: hostname: using hostnamed
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9484] hostname: static hostname changed from (none) to "np0005535656.novalocal"
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9492] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9638] manager[0x55e83631d080]: rfkill: Wi-Fi hardware radio set enabled
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9639] manager[0x55e83631d080]: rfkill: WWAN hardware radio set enabled
Nov 25 12:50:45 np0005535656 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9733] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9733] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9734] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9735] manager: Networking is enabled by state file
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9737] settings: Loaded settings plugin: keyfile (internal)
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9774] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9825] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9855] dhcp: init: Using DHCP client 'internal'
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9858] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9874] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9889] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9899] device (lo): Activation: starting connection 'lo' (ede43a25-bba5-487a-91f8-9e1a444321cb)
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9911] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9916] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9970] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9974] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9977] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9978] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9980] device (eth0): carrier: link connected
Nov 25 12:50:45 np0005535656 NetworkManager[859]: <info>  [1764093045.9984] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 25 12:50:45 np0005535656 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0008] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 25 12:50:46 np0005535656 systemd[1]: Started Network Manager.
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0025] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0032] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0034] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0039] manager: NetworkManager state is now CONNECTING
Nov 25 12:50:46 np0005535656 systemd[1]: Reached target Network.
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0042] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0059] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0064] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 12:50:46 np0005535656 systemd[1]: Starting Network Manager Wait Online...
Nov 25 12:50:46 np0005535656 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 25 12:50:46 np0005535656 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0246] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0250] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 25 12:50:46 np0005535656 NetworkManager[859]: <info>  [1764093046.0261] device (lo): Activation: successful, device activated.
Nov 25 12:50:46 np0005535656 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 25 12:50:46 np0005535656 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 25 12:50:46 np0005535656 systemd[1]: Reached target NFS client services.
Nov 25 12:50:46 np0005535656 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 25 12:50:46 np0005535656 systemd[1]: Reached target Remote File Systems.
Nov 25 12:50:46 np0005535656 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 25 12:50:47 np0005535656 NetworkManager[859]: <info>  [1764093047.7095] dhcp4 (eth0): state changed new lease, address=38.102.83.203
Nov 25 12:50:47 np0005535656 NetworkManager[859]: <info>  [1764093047.7113] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 25 12:50:47 np0005535656 NetworkManager[859]: <info>  [1764093047.7134] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 12:50:47 np0005535656 NetworkManager[859]: <info>  [1764093047.7178] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 12:50:47 np0005535656 NetworkManager[859]: <info>  [1764093047.7180] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 12:50:47 np0005535656 NetworkManager[859]: <info>  [1764093047.7182] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 12:50:47 np0005535656 NetworkManager[859]: <info>  [1764093047.7185] device (eth0): Activation: successful, device activated.
Nov 25 12:50:47 np0005535656 NetworkManager[859]: <info>  [1764093047.7190] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 25 12:50:47 np0005535656 NetworkManager[859]: <info>  [1764093047.7192] manager: startup complete
Nov 25 12:50:47 np0005535656 systemd[1]: Finished Network Manager Wait Online.
Nov 25 12:50:47 np0005535656 systemd[1]: Starting Cloud-init: Network Stage...
Nov 25 12:50:48 np0005535656 cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 25 Nov 2025 17:50:48 +0000. Up 9.59 seconds.
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.203         | 255.255.255.0 | global | fa:16:3e:34:8f:e4 |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe34:8fe4/64 |       .       |  link  | fa:16:3e:34:8f:e4 |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Nov 25 12:50:48 np0005535656 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 25 12:50:49 np0005535656 cloud-init[922]: Generating public/private rsa key pair.
Nov 25 12:50:49 np0005535656 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 25 12:50:49 np0005535656 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 25 12:50:49 np0005535656 cloud-init[922]: The key fingerprint is:
Nov 25 12:50:49 np0005535656 cloud-init[922]: SHA256:ff7EUFgjxL9AGF2djUCeA8/04l+t1qONGjTyech9KbE root@np0005535656.novalocal
Nov 25 12:50:49 np0005535656 cloud-init[922]: The key's randomart image is:
Nov 25 12:50:49 np0005535656 cloud-init[922]: +---[RSA 3072]----+
Nov 25 12:50:49 np0005535656 cloud-init[922]: |          oB*o+oo|
Nov 25 12:50:49 np0005535656 cloud-init[922]: |          .*+*.oo|
Nov 25 12:50:49 np0005535656 cloud-init[922]: |           .O.o  |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |         . ..+. .|
Nov 25 12:50:49 np0005535656 cloud-init[922]: |        S..o+o .o|
Nov 25 12:50:49 np0005535656 cloud-init[922]: |          =o=+++.|
Nov 25 12:50:49 np0005535656 cloud-init[922]: |           *.E*+.|
Nov 25 12:50:49 np0005535656 cloud-init[922]: |            o+* .|
Nov 25 12:50:49 np0005535656 cloud-init[922]: |           ..o.. |
Nov 25 12:50:49 np0005535656 cloud-init[922]: +----[SHA256]-----+
Nov 25 12:50:49 np0005535656 cloud-init[922]: Generating public/private ecdsa key pair.
Nov 25 12:50:49 np0005535656 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 25 12:50:49 np0005535656 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 25 12:50:49 np0005535656 cloud-init[922]: The key fingerprint is:
Nov 25 12:50:49 np0005535656 cloud-init[922]: SHA256:1Qw7sg9atGolgf72b6okmkDkoCL8boRI1ZF7UPd00EE root@np0005535656.novalocal
Nov 25 12:50:49 np0005535656 cloud-init[922]: The key's randomart image is:
Nov 25 12:50:49 np0005535656 cloud-init[922]: +---[ECDSA 256]---+
Nov 25 12:50:49 np0005535656 cloud-init[922]: |    ..+. ..o+E.  |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |   . =  . o=..   |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |... . + o +.o    |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |*. . . + = .     |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |*+. . o S        |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |=... . * o       |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |. ... B   .      |
Nov 25 12:50:49 np0005535656 cloud-init[922]: | ..+ = .  .      |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |  +.  ..o+.      |
Nov 25 12:50:49 np0005535656 cloud-init[922]: +----[SHA256]-----+
Nov 25 12:50:49 np0005535656 cloud-init[922]: Generating public/private ed25519 key pair.
Nov 25 12:50:49 np0005535656 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 25 12:50:49 np0005535656 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 25 12:50:49 np0005535656 cloud-init[922]: The key fingerprint is:
Nov 25 12:50:49 np0005535656 cloud-init[922]: SHA256:Mi2gGDCoLzeFPJHKTMeRIz+GJHToVtQLpthIP8Kfy6E root@np0005535656.novalocal
Nov 25 12:50:49 np0005535656 cloud-init[922]: The key's randomart image is:
Nov 25 12:50:49 np0005535656 cloud-init[922]: +--[ED25519 256]--+
Nov 25 12:50:49 np0005535656 cloud-init[922]: |=.+=+            |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |+B+O .           |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |&=Xo+ .          |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |=%=B.o .         |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |o.=o+ + S        |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |. ++   +         |
Nov 25 12:50:49 np0005535656 cloud-init[922]: | oo.o            |
Nov 25 12:50:49 np0005535656 cloud-init[922]: | E o             |
Nov 25 12:50:49 np0005535656 cloud-init[922]: |                 |
Nov 25 12:50:49 np0005535656 cloud-init[922]: +----[SHA256]-----+
Nov 25 12:50:49 np0005535656 sm-notify[1007]: Version 2.5.4 starting
Nov 25 12:50:49 np0005535656 systemd[1]: Finished Cloud-init: Network Stage.
Nov 25 12:50:49 np0005535656 systemd[1]: Reached target Cloud-config availability.
Nov 25 12:50:49 np0005535656 systemd[1]: Reached target Network is Online.
Nov 25 12:50:49 np0005535656 systemd[1]: Starting Cloud-init: Config Stage...
Nov 25 12:50:49 np0005535656 systemd[1]: Starting Crash recovery kernel arming...
Nov 25 12:50:49 np0005535656 systemd[1]: Starting Notify NFS peers of a restart...
Nov 25 12:50:49 np0005535656 systemd[1]: Starting System Logging Service...
Nov 25 12:50:49 np0005535656 systemd[1]: Starting OpenSSH server daemon...
Nov 25 12:50:49 np0005535656 systemd[1]: Starting Permit User Sessions...
Nov 25 12:50:49 np0005535656 systemd[1]: Started Notify NFS peers of a restart.
Nov 25 12:50:49 np0005535656 systemd[1]: Started OpenSSH server daemon.
Nov 25 12:50:49 np0005535656 systemd[1]: Finished Permit User Sessions.
Nov 25 12:50:49 np0005535656 systemd[1]: Started Command Scheduler.
Nov 25 12:50:49 np0005535656 systemd[1]: Started Getty on tty1.
Nov 25 12:50:49 np0005535656 systemd[1]: Started Serial Getty on ttyS0.
Nov 25 12:50:49 np0005535656 systemd[1]: Reached target Login Prompts.
Nov 25 12:50:49 np0005535656 rsyslogd[1008]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1008" x-info="https://www.rsyslog.com"] start
Nov 25 12:50:49 np0005535656 rsyslogd[1008]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 25 12:50:49 np0005535656 systemd[1]: Started System Logging Service.
Nov 25 12:50:49 np0005535656 systemd[1]: Reached target Multi-User System.
Nov 25 12:50:49 np0005535656 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 25 12:50:49 np0005535656 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 25 12:50:49 np0005535656 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 25 12:50:49 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 12:50:50 np0005535656 kdumpctl[1017]: kdump: No kdump initial ramdisk found.
Nov 25 12:50:50 np0005535656 kdumpctl[1017]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 25 12:50:50 np0005535656 cloud-init[1103]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 25 Nov 2025 17:50:50 +0000. Up 11.63 seconds.
Nov 25 12:50:50 np0005535656 systemd[1]: Finished Cloud-init: Config Stage.
Nov 25 12:50:50 np0005535656 systemd[1]: Starting Cloud-init: Final Stage...
Nov 25 12:50:50 np0005535656 cloud-init[1265]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 25 Nov 2025 17:50:50 +0000. Up 12.06 seconds.
Nov 25 12:50:50 np0005535656 dracut[1272]: dracut-057-102.git20250818.el9
Nov 25 12:50:50 np0005535656 cloud-init[1284]: #############################################################
Nov 25 12:50:50 np0005535656 cloud-init[1289]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 25 12:50:50 np0005535656 cloud-init[1292]: 256 SHA256:1Qw7sg9atGolgf72b6okmkDkoCL8boRI1ZF7UPd00EE root@np0005535656.novalocal (ECDSA)
Nov 25 12:50:50 np0005535656 cloud-init[1294]: 256 SHA256:Mi2gGDCoLzeFPJHKTMeRIz+GJHToVtQLpthIP8Kfy6E root@np0005535656.novalocal (ED25519)
Nov 25 12:50:50 np0005535656 cloud-init[1296]: 3072 SHA256:ff7EUFgjxL9AGF2djUCeA8/04l+t1qONGjTyech9KbE root@np0005535656.novalocal (RSA)
Nov 25 12:50:50 np0005535656 cloud-init[1297]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 25 12:50:50 np0005535656 cloud-init[1298]: #############################################################
Nov 25 12:50:50 np0005535656 cloud-init[1265]: Cloud-init v. 24.4-7.el9 finished at Tue, 25 Nov 2025 17:50:50 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 12.25 seconds
Nov 25 12:50:50 np0005535656 systemd[1]: Finished Cloud-init: Final Stage.
Nov 25 12:50:50 np0005535656 dracut[1274]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 25 12:50:50 np0005535656 systemd[1]: Reached target Cloud-init target.
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 25 12:50:51 np0005535656 dracut[1274]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: memstrack is not available
Nov 25 12:50:52 np0005535656 dracut[1274]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 25 12:50:52 np0005535656 dracut[1274]: memstrack is not available
Nov 25 12:50:52 np0005535656 dracut[1274]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 25 12:50:52 np0005535656 chronyd[802]: Selected source 167.160.187.179 (2.centos.pool.ntp.org)
Nov 25 12:50:52 np0005535656 chronyd[802]: System clock TAI offset set to 37 seconds
Nov 25 12:50:52 np0005535656 dracut[1274]: *** Including module: systemd ***
Nov 25 12:50:53 np0005535656 dracut[1274]: *** Including module: fips ***
Nov 25 12:50:53 np0005535656 dracut[1274]: *** Including module: systemd-initrd ***
Nov 25 12:50:53 np0005535656 dracut[1274]: *** Including module: i18n ***
Nov 25 12:50:53 np0005535656 dracut[1274]: *** Including module: drm ***
Nov 25 12:50:54 np0005535656 irqbalance[784]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 25 12:50:54 np0005535656 irqbalance[784]: IRQ 25 affinity is now unmanaged
Nov 25 12:50:54 np0005535656 irqbalance[784]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 25 12:50:54 np0005535656 irqbalance[784]: IRQ 31 affinity is now unmanaged
Nov 25 12:50:54 np0005535656 irqbalance[784]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 25 12:50:54 np0005535656 irqbalance[784]: IRQ 28 affinity is now unmanaged
Nov 25 12:50:54 np0005535656 irqbalance[784]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 25 12:50:54 np0005535656 irqbalance[784]: IRQ 32 affinity is now unmanaged
Nov 25 12:50:54 np0005535656 irqbalance[784]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 25 12:50:54 np0005535656 irqbalance[784]: IRQ 30 affinity is now unmanaged
Nov 25 12:50:54 np0005535656 irqbalance[784]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 25 12:50:54 np0005535656 irqbalance[784]: IRQ 29 affinity is now unmanaged
Nov 25 12:50:54 np0005535656 dracut[1274]: *** Including module: prefixdevname ***
Nov 25 12:50:54 np0005535656 dracut[1274]: *** Including module: kernel-modules ***
Nov 25 12:50:54 np0005535656 kernel: block vda: the capability attribute has been deprecated.
Nov 25 12:50:55 np0005535656 dracut[1274]: *** Including module: kernel-modules-extra ***
Nov 25 12:50:55 np0005535656 dracut[1274]: *** Including module: qemu ***
Nov 25 12:50:55 np0005535656 dracut[1274]: *** Including module: fstab-sys ***
Nov 25 12:50:55 np0005535656 dracut[1274]: *** Including module: rootfs-block ***
Nov 25 12:50:55 np0005535656 dracut[1274]: *** Including module: terminfo ***
Nov 25 12:50:55 np0005535656 dracut[1274]: *** Including module: udev-rules ***
Nov 25 12:50:56 np0005535656 dracut[1274]: Skipping udev rule: 91-permissions.rules
Nov 25 12:50:56 np0005535656 dracut[1274]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 25 12:50:56 np0005535656 dracut[1274]: *** Including module: virtiofs ***
Nov 25 12:50:56 np0005535656 dracut[1274]: *** Including module: dracut-systemd ***
Nov 25 12:50:56 np0005535656 dracut[1274]: *** Including module: usrmount ***
Nov 25 12:50:56 np0005535656 dracut[1274]: *** Including module: base ***
Nov 25 12:50:56 np0005535656 dracut[1274]: *** Including module: fs-lib ***
Nov 25 12:50:57 np0005535656 dracut[1274]: *** Including module: kdumpbase ***
Nov 25 12:50:57 np0005535656 dracut[1274]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 25 12:50:57 np0005535656 dracut[1274]:  microcode_ctl module: mangling fw_dir
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel" is ignored
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 25 12:50:57 np0005535656 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 25 12:50:57 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 25 12:50:58 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 25 12:50:58 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 25 12:50:58 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 25 12:50:58 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 25 12:50:58 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 25 12:50:58 np0005535656 dracut[1274]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 25 12:50:58 np0005535656 dracut[1274]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 25 12:50:58 np0005535656 dracut[1274]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 25 12:50:58 np0005535656 dracut[1274]: *** Including module: openssl ***
Nov 25 12:50:58 np0005535656 dracut[1274]: *** Including module: shutdown ***
Nov 25 12:50:58 np0005535656 dracut[1274]: *** Including module: squash ***
Nov 25 12:50:58 np0005535656 dracut[1274]: *** Including modules done ***
Nov 25 12:50:58 np0005535656 dracut[1274]: *** Installing kernel module dependencies ***
Nov 25 12:50:59 np0005535656 dracut[1274]: *** Installing kernel module dependencies done ***
Nov 25 12:50:59 np0005535656 dracut[1274]: *** Resolving executable dependencies ***
Nov 25 12:51:01 np0005535656 dracut[1274]: *** Resolving executable dependencies done ***
Nov 25 12:51:01 np0005535656 dracut[1274]: *** Generating early-microcode cpio image ***
Nov 25 12:51:01 np0005535656 dracut[1274]: *** Store current command line parameters ***
Nov 25 12:51:01 np0005535656 dracut[1274]: Stored kernel commandline:
Nov 25 12:51:01 np0005535656 dracut[1274]: No dracut internal kernel commandline stored in the initramfs
Nov 25 12:51:01 np0005535656 dracut[1274]: *** Install squash loader ***
Nov 25 12:51:02 np0005535656 dracut[1274]: *** Squashing the files inside the initramfs ***
Nov 25 12:51:03 np0005535656 dracut[1274]: *** Squashing the files inside the initramfs done ***
Nov 25 12:51:03 np0005535656 dracut[1274]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 25 12:51:03 np0005535656 dracut[1274]: *** Hardlinking files ***
Nov 25 12:51:03 np0005535656 dracut[1274]: *** Hardlinking files done ***
Nov 25 12:51:04 np0005535656 dracut[1274]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 25 12:51:04 np0005535656 kdumpctl[1017]: kdump: kexec: loaded kdump kernel
Nov 25 12:51:04 np0005535656 kdumpctl[1017]: kdump: Starting kdump: [OK]
Nov 25 12:51:04 np0005535656 systemd[1]: Finished Crash recovery kernel arming.
Nov 25 12:51:04 np0005535656 systemd[1]: Startup finished in 1.510s (kernel) + 2.850s (initrd) + 21.933s (userspace) = 26.293s.
Nov 25 12:51:08 np0005535656 systemd[1]: Created slice User Slice of UID 1000.
Nov 25 12:51:08 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 25 12:51:08 np0005535656 systemd-logind[788]: New session 1 of user zuul.
Nov 25 12:51:08 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 25 12:51:08 np0005535656 systemd[1]: Starting User Manager for UID 1000...
Nov 25 12:51:09 np0005535656 systemd[4302]: Queued start job for default target Main User Target.
Nov 25 12:51:09 np0005535656 systemd[4302]: Created slice User Application Slice.
Nov 25 12:51:09 np0005535656 systemd[4302]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 12:51:09 np0005535656 systemd[4302]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 12:51:09 np0005535656 systemd[4302]: Reached target Paths.
Nov 25 12:51:09 np0005535656 systemd[4302]: Reached target Timers.
Nov 25 12:51:09 np0005535656 systemd[4302]: Starting D-Bus User Message Bus Socket...
Nov 25 12:51:09 np0005535656 systemd[4302]: Starting Create User's Volatile Files and Directories...
Nov 25 12:51:09 np0005535656 systemd[4302]: Finished Create User's Volatile Files and Directories.
Nov 25 12:51:09 np0005535656 systemd[4302]: Listening on D-Bus User Message Bus Socket.
Nov 25 12:51:09 np0005535656 systemd[4302]: Reached target Sockets.
Nov 25 12:51:09 np0005535656 systemd[4302]: Reached target Basic System.
Nov 25 12:51:09 np0005535656 systemd[4302]: Reached target Main User Target.
Nov 25 12:51:09 np0005535656 systemd[4302]: Startup finished in 170ms.
Nov 25 12:51:09 np0005535656 systemd[1]: Started User Manager for UID 1000.
Nov 25 12:51:09 np0005535656 systemd[1]: Started Session 1 of User zuul.
Nov 25 12:51:09 np0005535656 python3[4384]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 12:51:13 np0005535656 python3[4412]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 12:51:15 np0005535656 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 12:51:21 np0005535656 python3[4472]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 12:51:21 np0005535656 python3[4512]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 25 12:51:23 np0005535656 python3[4538]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDW1XCpzrQ9MW6obIeOVzwK4Ru1PKYXLo1+iGKWtJMArDc5R9SwunIVjbyOOixrwKgjL3jhUXWIEKUJblfKkNcH6M2JT6JDJrx528lez51b5kNCsw6SkNuo0r9NpobMhaaBUoEYiUz3SyUax8DPkfswX0VgfzzbMh49kkazqTa1SDQ6Tw4pxgMM2+NLJ37dFHTFnacuwJA5/kbdbAknO0GONZgHJF6heJG7mxvfKERMa2wNoGU4PCALeBbbrz2jT4OPYsfJPrhIKBN2NNCJNjK/9lo4pX8hrAgwVpaCMPOc+jCrWH4aSRSfWQEfq2iJJ+M/Cfeupn35GsnQBBSNI48D72TpMVDIQBrLqRv3DUfT1cbjzKUZfBuzubVOZl1SiHycthCBB8unG/XY516aIhubkkm9/mRJT3WKR16XaRTOTBo5kqjco+giHtqkHmtPTj9y4a/3s2LeBLVwUVRwy0clvzJO17PJs5Np0Fa07nKpFosPyrLFu2sR+nx+NePiNWE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:24 np0005535656 python3[4562]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:24 np0005535656 python3[4661]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 12:51:25 np0005535656 python3[4732]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764093084.5573967-230-198266329363772/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=24b34a4fdc57493e98e14ca7d8b1c09b_id_rsa follow=False checksum=a7d84697a8a4fb166867397903d4e3bf3798ba94 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:26 np0005535656 python3[4855]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 12:51:26 np0005535656 python3[4926]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764093085.6553216-274-188498571213201/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=24b34a4fdc57493e98e14ca7d8b1c09b_id_rsa.pub follow=False checksum=134b94b25a470032a64a4ed24563bbdf96d7e9ae backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:27 np0005535656 python3[4974]: ansible-ping Invoked with data=pong
Nov 25 12:51:28 np0005535656 python3[4998]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 12:51:31 np0005535656 python3[5056]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 25 12:51:32 np0005535656 python3[5088]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:32 np0005535656 python3[5112]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:33 np0005535656 python3[5136]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:33 np0005535656 python3[5160]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:33 np0005535656 python3[5184]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:33 np0005535656 python3[5208]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:34 np0005535656 irqbalance[784]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 25 12:51:34 np0005535656 irqbalance[784]: IRQ 26 affinity is now unmanaged
Nov 25 12:51:35 np0005535656 python3[5234]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:36 np0005535656 python3[5312]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 12:51:36 np0005535656 python3[5385]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764093095.8458138-27-157757397115096/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:37 np0005535656 python3[5433]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:37 np0005535656 python3[5457]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:38 np0005535656 python3[5481]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:38 np0005535656 python3[5505]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:38 np0005535656 python3[5529]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:38 np0005535656 python3[5553]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:39 np0005535656 python3[5577]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:39 np0005535656 python3[5601]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:39 np0005535656 python3[5625]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:40 np0005535656 python3[5649]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:40 np0005535656 python3[5673]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:40 np0005535656 python3[5697]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:41 np0005535656 python3[5721]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:41 np0005535656 python3[5745]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:41 np0005535656 python3[5769]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:41 np0005535656 python3[5793]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:42 np0005535656 python3[5817]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:42 np0005535656 python3[5841]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:42 np0005535656 python3[5865]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:43 np0005535656 python3[5889]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:43 np0005535656 python3[5913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:43 np0005535656 python3[5937]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:44 np0005535656 python3[5961]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:44 np0005535656 python3[5985]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:44 np0005535656 python3[6009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:44 np0005535656 python3[6033]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 12:51:47 np0005535656 python3[6059]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 25 12:51:47 np0005535656 systemd[1]: Starting Time & Date Service...
Nov 25 12:51:47 np0005535656 systemd[1]: Started Time & Date Service.
Nov 25 12:51:47 np0005535656 systemd-timedated[6061]: Changed time zone to 'UTC' (UTC).
Nov 25 12:51:48 np0005535656 python3[6090]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:49 np0005535656 python3[6166]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 12:51:49 np0005535656 python3[6237]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764093109.1369772-203-147691684934383/source _original_basename=tmpfe9c0pdr follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:50 np0005535656 python3[6337]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 12:51:50 np0005535656 python3[6408]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764093110.1134622-244-36686436328552/source _original_basename=tmpjggwjcq9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:51 np0005535656 python3[6510]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 12:51:52 np0005535656 python3[6583]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764093111.276667-307-102254826825627/source _original_basename=tmpbdyknv0d follow=False checksum=b142a25af330be1fcee690876fc8170e6e08af8f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:52 np0005535656 python3[6631]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 12:51:52 np0005535656 python3[6657]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 12:51:53 np0005535656 python3[6737]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 12:51:53 np0005535656 python3[6810]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764093113.0617175-363-201933721461340/source _original_basename=tmp19ne91us follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:51:54 np0005535656 python3[6861]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-3b0c-095e-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 12:51:55 np0005535656 python3[6889]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-3b0c-095e-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 25 12:51:56 np0005535656 python3[6918]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:52:17 np0005535656 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 25 12:52:27 np0005535656 python3[6946]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:53:27 np0005535656 systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Nov 25 12:53:33 np0005535656 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 25 12:53:33 np0005535656 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 25 12:53:33 np0005535656 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 25 12:53:33 np0005535656 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 25 12:53:33 np0005535656 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 25 12:53:33 np0005535656 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 25 12:53:33 np0005535656 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 25 12:53:33 np0005535656 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 25 12:53:33 np0005535656 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 25 12:53:33 np0005535656 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.4964] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 25 12:53:33 np0005535656 systemd-udevd[6947]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5131] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5153] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5157] device (eth1): carrier: link connected
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5159] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5163] policy: auto-activating connection 'Wired connection 1' (d1311955-ac5a-3c9f-94f4-bbb7d8b3ac9e)
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5167] device (eth1): Activation: starting connection 'Wired connection 1' (d1311955-ac5a-3c9f-94f4-bbb7d8b3ac9e)
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5167] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5170] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5173] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 12:53:33 np0005535656 NetworkManager[859]: <info>  [1764093213.5177] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 25 12:53:33 np0005535656 systemd[4302]: Starting Mark boot as successful...
Nov 25 12:53:33 np0005535656 systemd[4302]: Finished Mark boot as successful.
Nov 25 12:53:34 np0005535656 systemd-logind[788]: New session 3 of user zuul.
Nov 25 12:53:34 np0005535656 systemd[1]: Started Session 3 of User zuul.
Nov 25 12:53:34 np0005535656 python3[6979]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-ec10-f69c-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 12:53:41 np0005535656 python3[7059]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 12:53:42 np0005535656 python3[7132]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764093221.4505506-154-55030086296801/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=a0e2c0158e3501e79d5ea1d841f1b8e232d0d7a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:53:42 np0005535656 python3[7182]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 12:53:42 np0005535656 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 25 12:53:42 np0005535656 systemd[1]: Stopped Network Manager Wait Online.
Nov 25 12:53:42 np0005535656 systemd[1]: Stopping Network Manager Wait Online...
Nov 25 12:53:42 np0005535656 systemd[1]: Stopping Network Manager...
Nov 25 12:53:42 np0005535656 NetworkManager[859]: <info>  [1764093222.9397] caught SIGTERM, shutting down normally.
Nov 25 12:53:42 np0005535656 NetworkManager[859]: <info>  [1764093222.9412] dhcp4 (eth0): canceled DHCP transaction
Nov 25 12:53:42 np0005535656 NetworkManager[859]: <info>  [1764093222.9413] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 12:53:42 np0005535656 NetworkManager[859]: <info>  [1764093222.9413] dhcp4 (eth0): state changed no lease
Nov 25 12:53:42 np0005535656 NetworkManager[859]: <info>  [1764093222.9417] manager: NetworkManager state is now CONNECTING
Nov 25 12:53:42 np0005535656 NetworkManager[859]: <info>  [1764093222.9594] dhcp4 (eth1): canceled DHCP transaction
Nov 25 12:53:42 np0005535656 NetworkManager[859]: <info>  [1764093222.9595] dhcp4 (eth1): state changed no lease
Nov 25 12:53:42 np0005535656 NetworkManager[859]: <info>  [1764093222.9647] exiting (success)
Nov 25 12:53:42 np0005535656 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 12:53:42 np0005535656 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 12:53:42 np0005535656 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 25 12:53:42 np0005535656 systemd[1]: Stopped Network Manager.
Nov 25 12:53:42 np0005535656 systemd[1]: NetworkManager.service: Consumed 1.340s CPU time, 10.1M memory peak.
Nov 25 12:53:43 np0005535656 systemd[1]: Starting Network Manager...
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.0569] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:95ac565f-f1e9-49d2-ac3b-e18fc9a1498e)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.0570] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.0644] manager[0x55da7ab30070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 25 12:53:43 np0005535656 systemd[1]: Starting Hostname Service...
Nov 25 12:53:43 np0005535656 systemd[1]: Started Hostname Service.
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1514] hostname: hostname: using hostnamed
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1515] hostname: static hostname changed from (none) to "np0005535656.novalocal"
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1519] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1523] manager[0x55da7ab30070]: rfkill: Wi-Fi hardware radio set enabled
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1523] manager[0x55da7ab30070]: rfkill: WWAN hardware radio set enabled
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1550] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1550] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1551] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1551] manager: Networking is enabled by state file
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1554] settings: Loaded settings plugin: keyfile (internal)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1558] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1580] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1590] dhcp: init: Using DHCP client 'internal'
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1592] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1597] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1602] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1609] device (lo): Activation: starting connection 'lo' (ede43a25-bba5-487a-91f8-9e1a444321cb)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1615] device (eth0): carrier: link connected
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1619] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1622] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1623] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1628] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1636] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1642] device (eth1): carrier: link connected
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1645] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1650] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (d1311955-ac5a-3c9f-94f4-bbb7d8b3ac9e) (indicated)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1650] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1656] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1663] device (eth1): Activation: starting connection 'Wired connection 1' (d1311955-ac5a-3c9f-94f4-bbb7d8b3ac9e)
Nov 25 12:53:43 np0005535656 systemd[1]: Started Network Manager.
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1673] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1678] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1681] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1682] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1685] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1689] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1692] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1695] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1699] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1707] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1710] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1727] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1732] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1761] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1769] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1781] device (lo): Activation: successful, device activated.
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1797] dhcp4 (eth0): state changed new lease, address=38.102.83.203
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1811] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 25 12:53:43 np0005535656 systemd[1]: Starting Network Manager Wait Online...
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1886] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1919] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1921] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1927] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1931] device (eth0): Activation: successful, device activated.
Nov 25 12:53:43 np0005535656 NetworkManager[7191]: <info>  [1764093223.1937] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 25 12:53:43 np0005535656 python3[7266]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-ec10-f69c-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 12:53:53 np0005535656 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 12:54:13 np0005535656 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4199] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 12:54:28 np0005535656 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 12:54:28 np0005535656 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4487] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4489] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4497] device (eth1): Activation: successful, device activated.
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4505] manager: startup complete
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4509] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <warn>  [1764093268.4514] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4523] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 25 12:54:28 np0005535656 systemd[1]: Finished Network Manager Wait Online.
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4675] dhcp4 (eth1): canceled DHCP transaction
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4675] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4675] dhcp4 (eth1): state changed no lease
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4690] policy: auto-activating connection 'ci-private-network' (4510c44e-1ef2-583b-8bc0-90ffe3477715)
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4695] device (eth1): Activation: starting connection 'ci-private-network' (4510c44e-1ef2-583b-8bc0-90ffe3477715)
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4696] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4698] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4713] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4724] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4763] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4765] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 12:54:28 np0005535656 NetworkManager[7191]: <info>  [1764093268.4773] device (eth1): Activation: successful, device activated.
Nov 25 12:54:38 np0005535656 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 12:54:43 np0005535656 systemd[1]: session-3.scope: Deactivated successfully.
Nov 25 12:54:43 np0005535656 systemd[1]: session-3.scope: Consumed 1.870s CPU time.
Nov 25 12:54:43 np0005535656 systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Nov 25 12:54:43 np0005535656 systemd-logind[788]: Removed session 3.
Nov 25 12:54:51 np0005535656 systemd-logind[788]: New session 4 of user zuul.
Nov 25 12:54:51 np0005535656 systemd[1]: Started Session 4 of User zuul.
Nov 25 12:54:52 np0005535656 python3[7379]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 12:54:52 np0005535656 python3[7452]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764093291.9359677-312-116819330988134/source _original_basename=tmpnzum5gvs follow=False checksum=7e4d0587608df8561be74ab1aa912d06de9c85bf backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:54:55 np0005535656 systemd[1]: session-4.scope: Deactivated successfully.
Nov 25 12:54:55 np0005535656 systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Nov 25 12:54:55 np0005535656 systemd-logind[788]: Removed session 4.
Nov 25 12:56:44 np0005535656 systemd[4302]: Created slice User Background Tasks Slice.
Nov 25 12:56:44 np0005535656 systemd[4302]: Starting Cleanup of User's Temporary Files and Directories...
Nov 25 12:56:44 np0005535656 systemd[4302]: Finished Cleanup of User's Temporary Files and Directories.
Nov 25 12:59:57 np0005535656 systemd-logind[788]: New session 5 of user zuul.
Nov 25 12:59:57 np0005535656 systemd[1]: Started Session 5 of User zuul.
Nov 25 12:59:57 np0005535656 python3[7514]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-ff49-f2d4-000000000c97-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 12:59:58 np0005535656 python3[7542]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:59:58 np0005535656 python3[7569]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:59:58 np0005535656 python3[7595]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:59:58 np0005535656 python3[7621]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:59:59 np0005535656 python3[7647]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 12:59:59 np0005535656 python3[7725]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 13:00:00 np0005535656 python3[7798]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764093599.5051332-343-114569036210273/source _original_basename=tmpew7z36qv follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:00:01 np0005535656 python3[7848]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:00:01 np0005535656 systemd[1]: Reloading.
Nov 25 13:00:01 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:00:02 np0005535656 python3[7904]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 25 13:00:03 np0005535656 python3[7930]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:00:03 np0005535656 python3[7958]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:00:04 np0005535656 python3[7986]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:00:04 np0005535656 python3[8014]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:00:04 np0005535656 python3[8041]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-ff49-f2d4-000000000c9e-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:00:05 np0005535656 python3[8071]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 25 13:00:08 np0005535656 systemd[1]: session-5.scope: Deactivated successfully.
Nov 25 13:00:08 np0005535656 systemd[1]: session-5.scope: Consumed 4.382s CPU time.
Nov 25 13:00:08 np0005535656 systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Nov 25 13:00:08 np0005535656 systemd-logind[788]: Removed session 5.
Nov 25 13:00:09 np0005535656 systemd-logind[788]: New session 6 of user zuul.
Nov 25 13:00:09 np0005535656 systemd[1]: Started Session 6 of User zuul.
Nov 25 13:00:10 np0005535656 python3[8104]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 25 13:00:34 np0005535656 kernel: SELinux:  Converting 385 SID table entries...
Nov 25 13:00:34 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:00:34 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:00:34 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:00:34 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:00:34 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:00:34 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:00:34 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:00:43 np0005535656 kernel: SELinux:  Converting 385 SID table entries...
Nov 25 13:00:43 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:00:43 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:00:43 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:00:43 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:00:43 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:00:43 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:00:43 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:00:52 np0005535656 kernel: SELinux:  Converting 385 SID table entries...
Nov 25 13:00:52 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:00:52 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:00:52 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:00:52 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:00:52 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:00:52 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:00:52 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:00:53 np0005535656 setsebool[8170]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 25 13:00:53 np0005535656 setsebool[8170]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 25 13:01:04 np0005535656 kernel: SELinux:  Converting 389 SID table entries...
Nov 25 13:01:04 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:01:04 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:01:04 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:01:04 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:01:04 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:01:04 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:01:04 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:01:24 np0005535656 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 25 13:01:24 np0005535656 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 13:01:24 np0005535656 systemd[1]: Starting man-db-cache-update.service...
Nov 25 13:01:24 np0005535656 systemd[1]: Reloading.
Nov 25 13:01:24 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:01:24 np0005535656 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 13:01:34 np0005535656 irqbalance[784]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 25 13:01:34 np0005535656 irqbalance[784]: IRQ 27 affinity is now unmanaged
Nov 25 13:01:41 np0005535656 python3[17319]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-2924-e321-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:01:42 np0005535656 kernel: evm: overlay not supported
Nov 25 13:01:42 np0005535656 systemd[4302]: Starting D-Bus User Message Bus...
Nov 25 13:01:42 np0005535656 dbus-broker-launch[17725]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 25 13:01:42 np0005535656 dbus-broker-launch[17725]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 25 13:01:42 np0005535656 systemd[4302]: Started D-Bus User Message Bus.
Nov 25 13:01:42 np0005535656 dbus-broker-lau[17725]: Ready
Nov 25 13:01:42 np0005535656 systemd[4302]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 25 13:01:42 np0005535656 systemd[4302]: Created slice Slice /user.
Nov 25 13:01:42 np0005535656 systemd[4302]: podman-17650.scope: unit configures an IP firewall, but not running as root.
Nov 25 13:01:42 np0005535656 systemd[4302]: (This warning is only shown for the first unit using IP firewalling.)
Nov 25 13:01:42 np0005535656 systemd[4302]: Started podman-17650.scope.
Nov 25 13:01:42 np0005535656 systemd[4302]: Started podman-pause-ae4778c5.scope.
Nov 25 13:01:43 np0005535656 python3[18017]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.103:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.103:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:01:43 np0005535656 python3[18017]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 25 13:01:43 np0005535656 systemd[1]: session-6.scope: Deactivated successfully.
Nov 25 13:01:43 np0005535656 systemd[1]: session-6.scope: Consumed 1min 1.876s CPU time.
Nov 25 13:01:43 np0005535656 systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Nov 25 13:01:43 np0005535656 systemd-logind[788]: Removed session 6.
Nov 25 13:02:07 np0005535656 systemd-logind[788]: New session 7 of user zuul.
Nov 25 13:02:07 np0005535656 systemd[1]: Started Session 7 of User zuul.
Nov 25 13:02:07 np0005535656 python3[27826]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGb5C/GZ94PaMWGg5u6/8+QmwMgail1fNueCu66aoW8/kDvxceETImWIGwCyxjIgAigJdKZ6i1RoTtXFCcAaNNk= zuul@np0005535654.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 13:02:07 np0005535656 python3[27999]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGb5C/GZ94PaMWGg5u6/8+QmwMgail1fNueCu66aoW8/kDvxceETImWIGwCyxjIgAigJdKZ6i1RoTtXFCcAaNNk= zuul@np0005535654.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 13:02:08 np0005535656 python3[28394]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005535656.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 25 13:02:09 np0005535656 python3[28642]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGb5C/GZ94PaMWGg5u6/8+QmwMgail1fNueCu66aoW8/kDvxceETImWIGwCyxjIgAigJdKZ6i1RoTtXFCcAaNNk= zuul@np0005535654.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 13:02:09 np0005535656 python3[28934]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 13:02:10 np0005535656 python3[29182]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764093729.2588773-152-37479491230320/source _original_basename=tmpnj1n170g follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:02:10 np0005535656 python3[29586]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Nov 25 13:02:10 np0005535656 systemd[1]: Starting Hostname Service...
Nov 25 13:02:11 np0005535656 systemd[1]: Started Hostname Service.
Nov 25 13:02:11 np0005535656 systemd-hostnamed[29693]: Changed pretty hostname to 'compute-1'
Nov 25 13:02:11 np0005535656 systemd-hostnamed[29693]: Hostname set to <compute-1> (static)
Nov 25 13:02:11 np0005535656 NetworkManager[7191]: <info>  [1764093731.0549] hostname: static hostname changed from "np0005535656.novalocal" to "compute-1"
Nov 25 13:02:11 np0005535656 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 13:02:11 np0005535656 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 13:02:11 np0005535656 systemd[1]: session-7.scope: Deactivated successfully.
Nov 25 13:02:11 np0005535656 systemd[1]: session-7.scope: Consumed 2.240s CPU time.
Nov 25 13:02:11 np0005535656 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Nov 25 13:02:11 np0005535656 systemd-logind[788]: Removed session 7.
Nov 25 13:02:11 np0005535656 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 13:02:11 np0005535656 systemd[1]: Finished man-db-cache-update.service.
Nov 25 13:02:11 np0005535656 systemd[1]: man-db-cache-update.service: Consumed 56.621s CPU time.
Nov 25 13:02:11 np0005535656 systemd[1]: run-r8c092c5ca5024dee83f7706027abdf39.service: Deactivated successfully.
Nov 25 13:02:21 np0005535656 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 13:02:41 np0005535656 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 13:05:44 np0005535656 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 25 13:05:44 np0005535656 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 25 13:05:44 np0005535656 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 25 13:05:44 np0005535656 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 25 13:07:20 np0005535656 systemd-logind[788]: New session 8 of user zuul.
Nov 25 13:07:20 np0005535656 systemd[1]: Started Session 8 of User zuul.
Nov 25 13:07:21 np0005535656 python3[30029]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:07:22 np0005535656 python3[30145]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 13:07:23 np0005535656 python3[30218]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094042.4838886-33874-227350890676278/source mode=0755 _original_basename=delorean.repo follow=False checksum=1830be8248976a7f714fb01ca8550e92dfc79ad2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:07:23 np0005535656 python3[30244]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 13:07:23 np0005535656 python3[30317]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094042.4838886-33874-227350890676278/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:07:24 np0005535656 python3[30343]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 13:07:24 np0005535656 python3[30416]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094042.4838886-33874-227350890676278/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:07:24 np0005535656 python3[30442]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 13:07:25 np0005535656 python3[30515]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094042.4838886-33874-227350890676278/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:07:25 np0005535656 python3[30541]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 13:07:25 np0005535656 python3[30614]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094042.4838886-33874-227350890676278/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:07:25 np0005535656 python3[30640]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 13:07:26 np0005535656 python3[30713]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094042.4838886-33874-227350890676278/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:07:26 np0005535656 python3[30739]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 13:07:26 np0005535656 python3[30812]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094042.4838886-33874-227350890676278/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6646317362318a9831d66a1804f6bb7dd1b97cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:09:59 np0005535656 python3[30864]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:14:59 np0005535656 systemd[1]: session-8.scope: Deactivated successfully.
Nov 25 13:14:59 np0005535656 systemd[1]: session-8.scope: Consumed 5.070s CPU time.
Nov 25 13:14:59 np0005535656 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Nov 25 13:14:59 np0005535656 systemd-logind[788]: Removed session 8.
Nov 25 13:23:09 np0005535656 systemd-logind[788]: New session 9 of user zuul.
Nov 25 13:23:09 np0005535656 systemd[1]: Started Session 9 of User zuul.
Nov 25 13:23:10 np0005535656 python3.9[31032]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:23:11 np0005535656 python3.9[31213]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:23:20 np0005535656 systemd[1]: session-9.scope: Deactivated successfully.
Nov 25 13:23:20 np0005535656 systemd[1]: session-9.scope: Consumed 7.859s CPU time.
Nov 25 13:23:20 np0005535656 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Nov 25 13:23:20 np0005535656 systemd-logind[788]: Removed session 9.
Nov 25 13:23:25 np0005535656 systemd-logind[788]: New session 10 of user zuul.
Nov 25 13:23:25 np0005535656 systemd[1]: Started Session 10 of User zuul.
Nov 25 13:23:27 np0005535656 python3.9[31423]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:23:27 np0005535656 systemd[1]: session-10.scope: Deactivated successfully.
Nov 25 13:23:27 np0005535656 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Nov 25 13:23:27 np0005535656 systemd-logind[788]: Removed session 10.
Nov 25 13:23:43 np0005535656 systemd-logind[788]: New session 11 of user zuul.
Nov 25 13:23:43 np0005535656 systemd[1]: Started Session 11 of User zuul.
Nov 25 13:23:44 np0005535656 python3.9[31607]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 25 13:23:45 np0005535656 python3.9[31781]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:23:46 np0005535656 python3.9[31933]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:23:47 np0005535656 python3.9[32086]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:23:48 np0005535656 python3.9[32238]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:23:49 np0005535656 python3.9[32390]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:23:49 np0005535656 python3.9[32513]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095028.64633-131-8852889193196/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:23:50 np0005535656 python3.9[32665]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:23:51 np0005535656 python3.9[32821]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:23:52 np0005535656 python3.9[32973]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:23:53 np0005535656 python3.9[33123]: ansible-ansible.builtin.service_facts Invoked
Nov 25 13:23:58 np0005535656 python3.9[33376]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:23:58 np0005535656 python3.9[33526]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:24:00 np0005535656 python3.9[33680]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:24:01 np0005535656 python3.9[33838]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:24:02 np0005535656 python3.9[33922]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:24:49 np0005535656 systemd[1]: Reloading.
Nov 25 13:24:49 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:24:49 np0005535656 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 25 13:24:49 np0005535656 systemd[1]: Reloading.
Nov 25 13:24:50 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:24:50 np0005535656 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 25 13:24:50 np0005535656 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 25 13:24:50 np0005535656 systemd[1]: Reloading.
Nov 25 13:24:50 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:24:50 np0005535656 systemd[1]: Starting dnf makecache...
Nov 25 13:24:50 np0005535656 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 25 13:24:50 np0005535656 dnf[34205]: Failed determining last makecache time.
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-barbican-42b4c41831408a8e323 140 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 176 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-cinder-1c00d6490d88e436f26ef 177 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-python-stevedore-c4acc5639fd2329372142 171 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-python-observabilityclient-2f31846d73c 198 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-os-net-config-bbae2ed8a159b0435a473f38 194 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 177 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-python-designate-tests-tempest-347fdbc 182 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-glance-1fd12c29b339f30fe823e 186 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 171 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-manila-3c01b7181572c95dac462 189 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-python-whitebox-neutron-tests-tempest- 194 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-octavia-ba397f07a7331190208c 182 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-watcher-c014f81a8647287f6dcc 183 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-python-tcib-1124124ec06aadbac34f0d340b 188 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 188 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-swift-dc98a8463506ac520c469a 173 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-python-tempestconf-8515371b7cceebd4282 186 kB/s | 3.0 kB     00:00
Nov 25 13:24:50 np0005535656 dnf[34205]: delorean-openstack-heat-ui-013accbfd179753bc3f0 169 kB/s | 3.0 kB     00:00
Nov 25 13:24:51 np0005535656 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Nov 25 13:24:51 np0005535656 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Nov 25 13:24:51 np0005535656 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Nov 25 13:24:51 np0005535656 dnf[34205]: CentOS Stream 9 - BaseOS                         25 kB/s | 6.7 kB     00:00
Nov 25 13:24:51 np0005535656 dnf[34205]: CentOS Stream 9 - AppStream                      67 kB/s | 6.8 kB     00:00
Nov 25 13:24:51 np0005535656 dnf[34205]: CentOS Stream 9 - CRB                            70 kB/s | 6.5 kB     00:00
Nov 25 13:24:51 np0005535656 dnf[34205]: CentOS Stream 9 - Extras packages                79 kB/s | 8.3 kB     00:00
Nov 25 13:24:51 np0005535656 dnf[34205]: dlrn-antelope-testing                           175 kB/s | 3.0 kB     00:00
Nov 25 13:24:51 np0005535656 dnf[34205]: dlrn-antelope-build-deps                        180 kB/s | 3.0 kB     00:00
Nov 25 13:24:51 np0005535656 dnf[34205]: centos9-rabbitmq                                113 kB/s | 3.0 kB     00:00
Nov 25 13:24:51 np0005535656 dnf[34205]: centos9-storage                                  24 kB/s | 3.0 kB     00:00
Nov 25 13:24:51 np0005535656 dnf[34205]: centos9-opstools                                 93 kB/s | 3.0 kB     00:00
Nov 25 13:24:52 np0005535656 dnf[34205]: NFV SIG OpenvSwitch                              78 kB/s | 3.0 kB     00:00
Nov 25 13:24:52 np0005535656 dnf[34205]: repo-setup-centos-appstream                      53 kB/s | 4.4 kB     00:00
Nov 25 13:24:52 np0005535656 dnf[34205]: repo-setup-centos-baseos                        188 kB/s | 3.9 kB     00:00
Nov 25 13:24:52 np0005535656 dnf[34205]: repo-setup-centos-highavailability              186 kB/s | 3.9 kB     00:00
Nov 25 13:24:52 np0005535656 dnf[34205]: repo-setup-centos-powertools                    209 kB/s | 4.3 kB     00:00
Nov 25 13:24:52 np0005535656 dnf[34205]: Extra Packages for Enterprise Linux 9 - x86_64   99 kB/s |  34 kB     00:00
Nov 25 13:24:53 np0005535656 dnf[34205]: Metadata cache created.
Nov 25 13:24:53 np0005535656 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 25 13:24:53 np0005535656 systemd[1]: Finished dnf makecache.
Nov 25 13:24:53 np0005535656 systemd[1]: dnf-makecache.service: Consumed 1.724s CPU time.
Nov 25 13:25:53 np0005535656 kernel: SELinux:  Converting 2719 SID table entries...
Nov 25 13:25:53 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:25:53 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:25:53 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:25:53 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:25:53 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:25:53 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:25:53 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:25:54 np0005535656 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 25 13:25:54 np0005535656 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 13:25:54 np0005535656 systemd[1]: Starting man-db-cache-update.service...
Nov 25 13:25:54 np0005535656 systemd[1]: Reloading.
Nov 25 13:25:54 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:25:54 np0005535656 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 13:25:55 np0005535656 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 13:25:55 np0005535656 systemd[1]: Finished man-db-cache-update.service.
Nov 25 13:25:55 np0005535656 systemd[1]: man-db-cache-update.service: Consumed 1.041s CPU time.
Nov 25 13:25:55 np0005535656 systemd[1]: run-r3f7ef166996f4ba2b2ab229b285d7506.service: Deactivated successfully.
Nov 25 13:25:55 np0005535656 python3.9[35470]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:25:58 np0005535656 python3.9[35751]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 25 13:25:59 np0005535656 python3.9[35903]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 25 13:26:01 np0005535656 python3.9[36056]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:26:02 np0005535656 python3.9[36210]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 25 13:26:03 np0005535656 python3.9[36362]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:26:04 np0005535656 python3.9[36514]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:26:04 np0005535656 python3.9[36637]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095163.999243-457-7525780686498/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=aaea985d0cb19289b719169c437f65790c5644d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:26:06 np0005535656 python3.9[36789]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:26:10 np0005535656 python3.9[36941]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:26:11 np0005535656 python3.9[37094]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:26:12 np0005535656 python3.9[37246]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 25 13:26:12 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 13:26:13 np0005535656 python3.9[37400]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 13:26:14 np0005535656 python3.9[37558]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 13:26:15 np0005535656 python3.9[37718]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 25 13:26:16 np0005535656 python3.9[37871]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 13:26:17 np0005535656 python3.9[38029]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 25 13:26:18 np0005535656 python3.9[38181]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:26:21 np0005535656 python3.9[38334]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:26:22 np0005535656 python3.9[38486]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:26:23 np0005535656 python3.9[38609]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095181.916513-696-185146034915120/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:26:24 np0005535656 python3.9[38761]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:26:24 np0005535656 systemd[1]: Starting Load Kernel Modules...
Nov 25 13:26:24 np0005535656 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 25 13:26:24 np0005535656 kernel: Bridge firewalling registered
Nov 25 13:26:24 np0005535656 systemd-modules-load[38765]: Inserted module 'br_netfilter'
Nov 25 13:26:24 np0005535656 systemd[1]: Finished Load Kernel Modules.
Nov 25 13:26:25 np0005535656 python3.9[38920]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:26:25 np0005535656 python3.9[39043]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095184.6637678-741-37361697661818/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:26:26 np0005535656 python3.9[39195]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:26:29 np0005535656 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Nov 25 13:26:29 np0005535656 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Nov 25 13:26:29 np0005535656 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 13:26:29 np0005535656 systemd[1]: Starting man-db-cache-update.service...
Nov 25 13:26:30 np0005535656 systemd[1]: Reloading.
Nov 25 13:26:30 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:26:30 np0005535656 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 13:26:32 np0005535656 python3.9[41265]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:26:33 np0005535656 python3.9[42476]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 25 13:26:33 np0005535656 python3.9[43211]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:26:33 np0005535656 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 13:26:33 np0005535656 systemd[1]: Finished man-db-cache-update.service.
Nov 25 13:26:33 np0005535656 systemd[1]: man-db-cache-update.service: Consumed 4.747s CPU time.
Nov 25 13:26:33 np0005535656 systemd[1]: run-r78cc9fc73ade4455a31dbeb10b16d7b2.service: Deactivated successfully.
Nov 25 13:26:34 np0005535656 python3.9[43364]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:26:34 np0005535656 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 25 13:26:35 np0005535656 systemd[1]: Starting Authorization Manager...
Nov 25 13:26:35 np0005535656 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 25 13:26:35 np0005535656 polkitd[43581]: Started polkitd version 0.117
Nov 25 13:26:35 np0005535656 systemd[1]: Started Authorization Manager.
Nov 25 13:26:37 np0005535656 python3.9[43751]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:26:37 np0005535656 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 25 13:26:37 np0005535656 systemd[1]: tuned.service: Deactivated successfully.
Nov 25 13:26:37 np0005535656 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 25 13:26:37 np0005535656 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 25 13:26:37 np0005535656 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 25 13:26:38 np0005535656 python3.9[43913]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 25 13:26:41 np0005535656 python3.9[44065]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:26:41 np0005535656 systemd[1]: Reloading.
Nov 25 13:26:41 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:26:42 np0005535656 python3.9[44254]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:26:42 np0005535656 systemd[1]: Reloading.
Nov 25 13:26:42 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:26:44 np0005535656 python3.9[44443]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:26:45 np0005535656 python3.9[44596]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:26:45 np0005535656 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 25 13:26:46 np0005535656 python3.9[44749]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:26:48 np0005535656 python3.9[44911]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:26:49 np0005535656 python3.9[45064]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:26:49 np0005535656 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 25 13:26:49 np0005535656 systemd[1]: Stopped Apply Kernel Variables.
Nov 25 13:26:49 np0005535656 systemd[1]: Stopping Apply Kernel Variables...
Nov 25 13:26:49 np0005535656 systemd[1]: Starting Apply Kernel Variables...
Nov 25 13:26:49 np0005535656 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 25 13:26:49 np0005535656 systemd[1]: Finished Apply Kernel Variables.
Nov 25 13:26:50 np0005535656 systemd[1]: session-11.scope: Deactivated successfully.
Nov 25 13:26:50 np0005535656 systemd[1]: session-11.scope: Consumed 2min 9.502s CPU time.
Nov 25 13:26:50 np0005535656 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Nov 25 13:26:50 np0005535656 systemd-logind[788]: Removed session 11.
Nov 25 13:26:55 np0005535656 systemd-logind[788]: New session 12 of user zuul.
Nov 25 13:26:55 np0005535656 systemd[1]: Started Session 12 of User zuul.
Nov 25 13:26:56 np0005535656 python3.9[45247]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:26:58 np0005535656 python3.9[45401]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:26:59 np0005535656 python3.9[45557]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:27:00 np0005535656 python3.9[45708]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:27:01 np0005535656 python3.9[45864]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:27:02 np0005535656 python3.9[45948]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:27:04 np0005535656 python3.9[46101]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:27:05 np0005535656 python3.9[46272]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:27:06 np0005535656 python3.9[46424]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:27:06 np0005535656 systemd[1]: var-lib-containers-storage-overlay-compat1961637766-merged.mount: Deactivated successfully.
Nov 25 13:27:06 np0005535656 podman[46425]: 2025-11-25 18:27:06.801740384 +0000 UTC m=+0.042568288 system refresh
Nov 25 13:27:07 np0005535656 python3.9[46587]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:27:07 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:27:08 np0005535656 python3.9[46710]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095227.0756357-204-37662669790048/.source.json follow=False _original_basename=podman_network_config.j2 checksum=b5b2d8387ae7d099a6b8fc6c2b7c3da5a7d26697 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:27:09 np0005535656 python3.9[46862]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:27:09 np0005535656 python3.9[46985]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095228.6419964-234-223303974765622/.source.conf follow=False _original_basename=registries.conf.j2 checksum=f95551851a3aad1fadf39ba40ad5808b10502fe1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:27:10 np0005535656 python3.9[47137]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:27:11 np0005535656 python3.9[47289]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:27:12 np0005535656 python3.9[47441]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:27:12 np0005535656 python3.9[47593]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:27:13 np0005535656 python3.9[47743]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:27:14 np0005535656 python3.9[47897]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:27:16 np0005535656 python3.9[48050]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:27:20 np0005535656 python3.9[48210]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:27:22 np0005535656 python3.9[48363]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:27:24 np0005535656 python3.9[48516]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:27:27 np0005535656 python3.9[48672]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:27:30 np0005535656 python3.9[48841]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:27:33 np0005535656 python3.9[48994]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:27:47 np0005535656 python3.9[49331]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:27:50 np0005535656 python3.9[49487]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:27:51 np0005535656 python3.9[49662]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:27:51 np0005535656 python3.9[49785]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764095270.589897-530-23384550035276/.source.json _original_basename=.1oqku6ii follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:27:52 np0005535656 python3.9[49937]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 13:27:52 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:27:54 np0005535656 systemd[1]: var-lib-containers-storage-overlay-compat1576949263-lower\x2dmapped.mount: Deactivated successfully.
Nov 25 13:27:58 np0005535656 podman[49950]: 2025-11-25 18:27:58.028125091 +0000 UTC m=+5.126034975 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 25 13:27:58 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:27:58 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:27:58 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:27:59 np0005535656 python3.9[50250]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 13:27:59 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:08 np0005535656 podman[50262]: 2025-11-25 18:28:08.871966628 +0000 UTC m=+9.519027953 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 13:28:08 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:08 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:08 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:12 np0005535656 python3.9[50583]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 13:28:12 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:13 np0005535656 podman[50595]: 2025-11-25 18:28:13.389634925 +0000 UTC m=+1.189127249 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 25 13:28:13 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:13 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:13 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:14 np0005535656 python3.9[50828]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 13:28:14 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:24 np0005535656 podman[50842]: 2025-11-25 18:28:24.51428142 +0000 UTC m=+9.536359999 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 25 13:28:24 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:24 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:24 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:28 np0005535656 python3.9[51116]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 13:28:28 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:31 np0005535656 podman[51129]: 2025-11-25 18:28:31.987127445 +0000 UTC m=+3.892852799 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 25 13:28:31 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:32 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:32 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:32 np0005535656 python3.9[51386]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 13:28:33 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:34 np0005535656 podman[51398]: 2025-11-25 18:28:34.143802109 +0000 UTC m=+1.110737236 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 25 13:28:34 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:34 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:34 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:28:35 np0005535656 systemd[1]: session-12.scope: Deactivated successfully.
Nov 25 13:28:35 np0005535656 systemd[1]: session-12.scope: Consumed 1min 51.037s CPU time.
Nov 25 13:28:35 np0005535656 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Nov 25 13:28:35 np0005535656 systemd-logind[788]: Removed session 12.
Nov 25 13:28:40 np0005535656 systemd-logind[788]: New session 13 of user zuul.
Nov 25 13:28:40 np0005535656 systemd[1]: Started Session 13 of User zuul.
Nov 25 13:28:41 np0005535656 python3.9[51701]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:28:43 np0005535656 python3.9[51857]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 25 13:28:44 np0005535656 python3.9[52010]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 13:28:45 np0005535656 python3.9[52168]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 13:28:46 np0005535656 python3.9[52328]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:28:47 np0005535656 python3.9[52412]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:28:50 np0005535656 python3.9[52573]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:29:03 np0005535656 kernel: SELinux:  Converting 2732 SID table entries...
Nov 25 13:29:03 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:29:03 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:29:03 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:29:03 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:29:03 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:29:03 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:29:03 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:29:04 np0005535656 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 25 13:29:04 np0005535656 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 25 13:29:05 np0005535656 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 13:29:05 np0005535656 systemd[1]: Starting man-db-cache-update.service...
Nov 25 13:29:05 np0005535656 systemd[1]: Reloading.
Nov 25 13:29:05 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:29:05 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:29:05 np0005535656 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 13:29:06 np0005535656 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 13:29:06 np0005535656 systemd[1]: Finished man-db-cache-update.service.
Nov 25 13:29:06 np0005535656 systemd[1]: run-rb36ba58493164afd9e0bc41bf1f2d884.service: Deactivated successfully.
Nov 25 13:29:08 np0005535656 python3.9[53673]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 13:29:08 np0005535656 systemd[1]: Reloading.
Nov 25 13:29:08 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:29:08 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:29:09 np0005535656 systemd[1]: Starting Open vSwitch Database Unit...
Nov 25 13:29:09 np0005535656 chown[53715]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 25 13:29:09 np0005535656 ovs-ctl[53720]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 25 13:29:09 np0005535656 ovs-ctl[53720]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 25 13:29:09 np0005535656 ovs-ctl[53720]: Starting ovsdb-server [  OK  ]
Nov 25 13:29:09 np0005535656 ovs-vsctl[53769]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 25 13:29:09 np0005535656 ovs-vsctl[53789]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"0dba517c-b8b5-44c5-b9d2-340b509da9f7\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 25 13:29:09 np0005535656 ovs-ctl[53720]: Configuring Open vSwitch system IDs [  OK  ]
Nov 25 13:29:09 np0005535656 ovs-ctl[53720]: Enabling remote OVSDB managers [  OK  ]
Nov 25 13:29:09 np0005535656 ovs-vsctl[53795]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 25 13:29:09 np0005535656 systemd[1]: Started Open vSwitch Database Unit.
Nov 25 13:29:09 np0005535656 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 25 13:29:09 np0005535656 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 25 13:29:09 np0005535656 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 25 13:29:09 np0005535656 kernel: openvswitch: Open vSwitch switching datapath
Nov 25 13:29:09 np0005535656 ovs-ctl[53840]: Inserting openvswitch module [  OK  ]
Nov 25 13:29:09 np0005535656 ovs-ctl[53809]: Starting ovs-vswitchd [  OK  ]
Nov 25 13:29:09 np0005535656 ovs-vsctl[53857]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 25 13:29:09 np0005535656 ovs-ctl[53809]: Enabling remote OVSDB managers [  OK  ]
Nov 25 13:29:09 np0005535656 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 25 13:29:09 np0005535656 systemd[1]: Starting Open vSwitch...
Nov 25 13:29:09 np0005535656 systemd[1]: Finished Open vSwitch.
Nov 25 13:29:10 np0005535656 python3.9[54009]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:29:11 np0005535656 python3.9[54161]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 25 13:29:13 np0005535656 kernel: SELinux:  Converting 2746 SID table entries...
Nov 25 13:29:13 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:29:13 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:29:13 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:29:13 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:29:13 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:29:13 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:29:13 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:29:14 np0005535656 python3.9[54316]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:29:15 np0005535656 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 25 13:29:15 np0005535656 python3.9[54474]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:29:17 np0005535656 python3.9[54627]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:29:19 np0005535656 python3.9[54914]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 25 13:29:20 np0005535656 python3.9[55064]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:29:21 np0005535656 python3.9[55218]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:29:23 np0005535656 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 13:29:23 np0005535656 systemd[1]: Starting man-db-cache-update.service...
Nov 25 13:29:23 np0005535656 systemd[1]: Reloading.
Nov 25 13:29:23 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:29:23 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:29:23 np0005535656 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 13:29:23 np0005535656 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 13:29:23 np0005535656 systemd[1]: Finished man-db-cache-update.service.
Nov 25 13:29:23 np0005535656 systemd[1]: run-r2cfe3e0aed054a6f8d00754c69e54d60.service: Deactivated successfully.
Nov 25 13:29:24 np0005535656 python3.9[55536]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:29:24 np0005535656 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 25 13:29:24 np0005535656 systemd[1]: Stopped Network Manager Wait Online.
Nov 25 13:29:24 np0005535656 systemd[1]: Stopping Network Manager Wait Online...
Nov 25 13:29:24 np0005535656 systemd[1]: Stopping Network Manager...
Nov 25 13:29:24 np0005535656 NetworkManager[7191]: <info>  [1764095364.6964] caught SIGTERM, shutting down normally.
Nov 25 13:29:24 np0005535656 NetworkManager[7191]: <info>  [1764095364.6985] dhcp4 (eth0): canceled DHCP transaction
Nov 25 13:29:24 np0005535656 NetworkManager[7191]: <info>  [1764095364.6985] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 13:29:24 np0005535656 NetworkManager[7191]: <info>  [1764095364.6985] dhcp4 (eth0): state changed no lease
Nov 25 13:29:24 np0005535656 NetworkManager[7191]: <info>  [1764095364.6988] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 13:29:24 np0005535656 NetworkManager[7191]: <info>  [1764095364.7071] exiting (success)
Nov 25 13:29:24 np0005535656 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 13:29:24 np0005535656 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 13:29:24 np0005535656 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 25 13:29:24 np0005535656 systemd[1]: Stopped Network Manager.
Nov 25 13:29:24 np0005535656 systemd[1]: NetworkManager.service: Consumed 13.607s CPU time, 4.1M memory peak, read 0B from disk, written 20.5K to disk.
Nov 25 13:29:24 np0005535656 systemd[1]: Starting Network Manager...
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.8062] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:95ac565f-f1e9-49d2-ac3b-e18fc9a1498e)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.8064] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.8131] manager[0x559e77f15090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 25 13:29:24 np0005535656 systemd[1]: Starting Hostname Service...
Nov 25 13:29:24 np0005535656 systemd[1]: Started Hostname Service.
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9289] hostname: hostname: using hostnamed
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9290] hostname: static hostname changed from (none) to "compute-1"
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9297] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9302] manager[0x559e77f15090]: rfkill: Wi-Fi hardware radio set enabled
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9303] manager[0x559e77f15090]: rfkill: WWAN hardware radio set enabled
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9326] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9336] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9337] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9337] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9338] manager: Networking is enabled by state file
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9341] settings: Loaded settings plugin: keyfile (internal)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9345] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9372] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9381] dhcp: init: Using DHCP client 'internal'
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9384] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9390] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9396] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9404] device (lo): Activation: starting connection 'lo' (ede43a25-bba5-487a-91f8-9e1a444321cb)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9411] device (eth0): carrier: link connected
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9418] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9423] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9424] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9432] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9440] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9447] device (eth1): carrier: link connected
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9451] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9456] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (4510c44e-1ef2-583b-8bc0-90ffe3477715) (indicated)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9457] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9462] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9469] device (eth1): Activation: starting connection 'ci-private-network' (4510c44e-1ef2-583b-8bc0-90ffe3477715)
Nov 25 13:29:24 np0005535656 systemd[1]: Started Network Manager.
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9475] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9482] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9486] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9488] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9490] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9492] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9494] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9496] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9499] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9507] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9509] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9518] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9529] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9538] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9539] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9543] device (lo): Activation: successful, device activated.
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9563] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9564] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9566] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9569] device (eth1): Activation: successful, device activated.
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9574] dhcp4 (eth0): state changed new lease, address=38.102.83.203
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9579] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9662] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 systemd[1]: Starting Network Manager Wait Online...
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9684] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9685] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9688] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9690] device (eth0): Activation: successful, device activated.
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9693] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 25 13:29:24 np0005535656 NetworkManager[55548]: <info>  [1764095364.9694] manager: startup complete
Nov 25 13:29:24 np0005535656 systemd[1]: Finished Network Manager Wait Online.
Nov 25 13:29:25 np0005535656 python3.9[55762]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:29:30 np0005535656 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 13:29:30 np0005535656 systemd[1]: Starting man-db-cache-update.service...
Nov 25 13:29:30 np0005535656 systemd[1]: Reloading.
Nov 25 13:29:30 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:29:30 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:29:30 np0005535656 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 13:29:31 np0005535656 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 13:29:31 np0005535656 systemd[1]: Finished man-db-cache-update.service.
Nov 25 13:29:31 np0005535656 systemd[1]: run-r837d3e8ad67c4ef490be44e6ebda0f2f.service: Deactivated successfully.
Nov 25 13:29:32 np0005535656 python3.9[56221]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:29:33 np0005535656 python3.9[56373]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:35 np0005535656 python3.9[56527]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:35 np0005535656 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 13:29:35 np0005535656 python3.9[56679]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:36 np0005535656 python3.9[56831]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:37 np0005535656 python3.9[56983]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:38 np0005535656 python3.9[57135]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:29:38 np0005535656 python3.9[57258]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095377.5381846-444-61367208877829/.source _original_basename=.811nti1w follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:39 np0005535656 python3.9[57410]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:40 np0005535656 python3.9[57562]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 25 13:29:41 np0005535656 python3.9[57714]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:44 np0005535656 python3.9[58141]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 25 13:29:45 np0005535656 ansible-async_wrapper.py[58316]: Invoked with j234244749303 300 /home/zuul/.ansible/tmp/ansible-tmp-1764095384.4076474-576-276345953869015/AnsiballZ_edpm_os_net_config.py _
Nov 25 13:29:45 np0005535656 ansible-async_wrapper.py[58319]: Starting module and watcher
Nov 25 13:29:45 np0005535656 ansible-async_wrapper.py[58319]: Start watching 58320 (300)
Nov 25 13:29:45 np0005535656 ansible-async_wrapper.py[58320]: Start module (58320)
Nov 25 13:29:45 np0005535656 ansible-async_wrapper.py[58316]: Return async_wrapper task started.
Nov 25 13:29:45 np0005535656 python3.9[58321]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 25 13:29:46 np0005535656 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 25 13:29:46 np0005535656 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 25 13:29:46 np0005535656 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 25 13:29:46 np0005535656 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 25 13:29:46 np0005535656 kernel: cfg80211: failed to load regulatory.db
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1176] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1192] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1585] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1586] audit: op="connection-add" uuid="370d7f3e-e952-4007-886e-7e976b7a207b" name="br-ex-br" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1601] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1602] audit: op="connection-add" uuid="494216f0-9b09-49e6-a80e-0a4241ae7799" name="br-ex-port" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1614] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1615] audit: op="connection-add" uuid="f1bcacf3-6723-465b-9531-5a90f3eb83aa" name="eth1-port" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1627] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1628] audit: op="connection-add" uuid="47940dc4-c534-4af2-9a01-d2cdb6da1d81" name="vlan20-port" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1638] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1638] audit: op="connection-add" uuid="fc6c88ae-0910-4dbf-b7b6-4b3523f298a9" name="vlan21-port" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1648] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1648] audit: op="connection-add" uuid="ae38b4ea-9b77-46fc-afa0-49946161bdc0" name="vlan22-port" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1666] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1681] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1682] audit: op="connection-add" uuid="43b63386-16b3-4b9a-95be-c69d0a0f729d" name="br-ex-if" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1777] audit: op="connection-update" uuid="4510c44e-1ef2-583b-8bc0-90ffe3477715" name="ci-private-network" args="connection.port-type,connection.slave-type,connection.timestamp,connection.master,connection.controller,ipv4.routes,ipv4.never-default,ipv4.routing-rules,ipv4.addresses,ipv4.dns,ipv4.method,ipv6.routes,ipv6.method,ipv6.routing-rules,ipv6.addresses,ipv6.addr-gen-mode,ipv6.dns,ovs-external-ids.data,ovs-interface.type" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1794] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1795] audit: op="connection-add" uuid="f6e30d86-c398-414d-a250-addb943db2dd" name="vlan20-if" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1808] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1808] audit: op="connection-add" uuid="44e4b20f-06fb-489d-bff4-94a2c87c3c9e" name="vlan21-if" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1823] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1824] audit: op="connection-add" uuid="05200868-e587-4903-911c-7409b247cf7d" name="vlan22-if" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1833] audit: op="connection-delete" uuid="d1311955-ac5a-3c9f-94f4-bbb7d8b3ac9e" name="Wired connection 1" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1842] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1849] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1852] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (370d7f3e-e952-4007-886e-7e976b7a207b)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1852] audit: op="connection-activate" uuid="370d7f3e-e952-4007-886e-7e976b7a207b" name="br-ex-br" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1853] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1858] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1862] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (494216f0-9b09-49e6-a80e-0a4241ae7799)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1863] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1867] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1869] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (f1bcacf3-6723-465b-9531-5a90f3eb83aa)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1871] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1875] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1878] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (47940dc4-c534-4af2-9a01-d2cdb6da1d81)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1880] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1884] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1887] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (fc6c88ae-0910-4dbf-b7b6-4b3523f298a9)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1888] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1892] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1895] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ae38b4ea-9b77-46fc-afa0-49946161bdc0)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1896] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1897] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1898] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1903] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1906] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1909] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (43b63386-16b3-4b9a-95be-c69d0a0f729d)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1910] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1912] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1914] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1915] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1916] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1928] device (eth1): disconnecting for new activation request.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1929] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1932] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1933] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1934] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1936] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1940] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1944] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (f6e30d86-c398-414d-a250-addb943db2dd)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1945] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1947] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1948] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1949] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1952] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1956] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1960] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (44e4b20f-06fb-489d-bff4-94a2c87c3c9e)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1960] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1963] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1964] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1965] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1968] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1973] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1978] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (05200868-e587-4903-911c-7409b247cf7d)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1979] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1981] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1983] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1984] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1985] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1996] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.1998] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2001] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2002] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2008] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2012] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2015] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2018] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2020] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2024] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2028] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2032] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2035] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2041] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2045] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2048] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2050] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2055] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2059] dhcp4 (eth0): canceled DHCP transaction
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2059] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2059] dhcp4 (eth0): state changed no lease
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2060] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2070] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58322 uid=0 result="fail" reason="Device is not activated"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2127] device (eth1): disconnecting for new activation request.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2128] audit: op="connection-activate" uuid="4510c44e-1ef2-583b-8bc0-90ffe3477715" name="ci-private-network" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2319] dhcp4 (eth0): state changed new lease, address=38.102.83.203
Nov 25 13:29:47 np0005535656 kernel: ovs-system: entered promiscuous mode
Nov 25 13:29:47 np0005535656 systemd-udevd[58328]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:29:47 np0005535656 kernel: Timeout policy base is empty
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2388] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2403] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2406] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58322 uid=0 result="success"
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2408] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2428] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 13:29:47 np0005535656 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2606] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 25 13:29:47 np0005535656 kernel: br-ex: entered promiscuous mode
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2773] device (eth1): Activation: starting connection 'ci-private-network' (4510c44e-1ef2-583b-8bc0-90ffe3477715)
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2779] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2786] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2789] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2794] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2798] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2811] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2813] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 kernel: vlan22: entered promiscuous mode
Nov 25 13:29:47 np0005535656 systemd-udevd[58327]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2820] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2821] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2822] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2829] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2836] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2841] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2845] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2849] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2853] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2857] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2860] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2864] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2868] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2871] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2878] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2884] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2888] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 kernel: vlan21: entered promiscuous mode
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2907] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2911] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2918] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2924] device (eth1): Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2937] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2938] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2941] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2946] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.2959] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 kernel: vlan20: entered promiscuous mode
Nov 25 13:29:47 np0005535656 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3041] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3046] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3052] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3058] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3073] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3175] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3177] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3182] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3189] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3209] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3246] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3249] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 13:29:47 np0005535656 NetworkManager[55548]: <info>  [1764095387.3257] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 13:29:48 np0005535656 NetworkManager[55548]: <info>  [1764095388.4434] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58322 uid=0 result="success"
Nov 25 13:29:48 np0005535656 NetworkManager[55548]: <info>  [1764095388.6217] checkpoint[0x559e77eeb950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 25 13:29:48 np0005535656 NetworkManager[55548]: <info>  [1764095388.6219] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58322 uid=0 result="success"
Nov 25 13:29:48 np0005535656 NetworkManager[55548]: <info>  [1764095388.8432] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58322 uid=0 result="success"
Nov 25 13:29:48 np0005535656 NetworkManager[55548]: <info>  [1764095388.8441] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58322 uid=0 result="success"
Nov 25 13:29:49 np0005535656 python3.9[58656]: ansible-ansible.legacy.async_status Invoked with jid=j234244749303.58316 mode=status _async_dir=/root/.ansible_async
Nov 25 13:29:49 np0005535656 NetworkManager[55548]: <info>  [1764095389.0172] audit: op="networking-control" arg="global-dns-configuration" pid=58322 uid=0 result="success"
Nov 25 13:29:49 np0005535656 NetworkManager[55548]: <info>  [1764095389.0215] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 25 13:29:49 np0005535656 NetworkManager[55548]: <info>  [1764095389.0243] audit: op="networking-control" arg="global-dns-configuration" pid=58322 uid=0 result="success"
Nov 25 13:29:49 np0005535656 NetworkManager[55548]: <info>  [1764095389.0265] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58322 uid=0 result="success"
Nov 25 13:29:49 np0005535656 NetworkManager[55548]: <info>  [1764095389.1706] checkpoint[0x559e77eeba20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 25 13:29:49 np0005535656 NetworkManager[55548]: <info>  [1764095389.1710] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58322 uid=0 result="success"
Nov 25 13:29:49 np0005535656 ansible-async_wrapper.py[58320]: Module complete (58320)
Nov 25 13:29:50 np0005535656 ansible-async_wrapper.py[58319]: Done in kid B.
Nov 25 13:29:52 np0005535656 python3.9[58760]: ansible-ansible.legacy.async_status Invoked with jid=j234244749303.58316 mode=status _async_dir=/root/.ansible_async
Nov 25 13:29:53 np0005535656 python3.9[58860]: ansible-ansible.legacy.async_status Invoked with jid=j234244749303.58316 mode=cleanup _async_dir=/root/.ansible_async
Nov 25 13:29:54 np0005535656 python3.9[59012]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:29:54 np0005535656 python3.9[59135]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095393.4483848-630-175276602885751/.source.returncode _original_basename=.1hdbaxdd follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:54 np0005535656 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 13:29:55 np0005535656 python3.9[59289]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:29:56 np0005535656 python3.9[59413]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095395.1634784-662-16149231792523/.source.cfg _original_basename=.4ze35__q follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:29:57 np0005535656 python3.9[59565]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:29:57 np0005535656 systemd[1]: Reloading Network Manager...
Nov 25 13:29:57 np0005535656 NetworkManager[55548]: <info>  [1764095397.2445] audit: op="reload" arg="0" pid=59569 uid=0 result="success"
Nov 25 13:29:57 np0005535656 NetworkManager[55548]: <info>  [1764095397.2451] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 25 13:29:57 np0005535656 systemd[1]: Reloaded Network Manager.
Nov 25 13:29:57 np0005535656 systemd[1]: session-13.scope: Deactivated successfully.
Nov 25 13:29:57 np0005535656 systemd[1]: session-13.scope: Consumed 54.042s CPU time.
Nov 25 13:29:57 np0005535656 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Nov 25 13:29:57 np0005535656 systemd-logind[788]: Removed session 13.
Nov 25 13:30:03 np0005535656 systemd-logind[788]: New session 14 of user zuul.
Nov 25 13:30:03 np0005535656 systemd[1]: Started Session 14 of User zuul.
Nov 25 13:30:04 np0005535656 python3.9[59753]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:30:05 np0005535656 python3.9[59907]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:30:06 np0005535656 python3.9[60097]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:30:07 np0005535656 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 13:30:07 np0005535656 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Nov 25 13:30:07 np0005535656 systemd[1]: session-14.scope: Deactivated successfully.
Nov 25 13:30:07 np0005535656 systemd[1]: session-14.scope: Consumed 2.579s CPU time.
Nov 25 13:30:07 np0005535656 systemd-logind[788]: Removed session 14.
Nov 25 13:30:13 np0005535656 systemd-logind[788]: New session 15 of user zuul.
Nov 25 13:30:13 np0005535656 systemd[1]: Started Session 15 of User zuul.
Nov 25 13:30:14 np0005535656 python3.9[60279]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:30:15 np0005535656 python3.9[60434]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:30:16 np0005535656 python3.9[60590]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:30:17 np0005535656 python3.9[60674]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:30:19 np0005535656 python3.9[60828]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:30:21 np0005535656 python3.9[61019]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:30:22 np0005535656 python3.9[61171]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:30:22 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:30:23 np0005535656 python3.9[61334]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:30:23 np0005535656 python3.9[61412]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:30:24 np0005535656 python3.9[61564]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:30:25 np0005535656 python3.9[61642]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:30:26 np0005535656 python3.9[61794]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:30:26 np0005535656 python3.9[61948]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:30:27 np0005535656 python3.9[62100]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:30:28 np0005535656 python3.9[62252]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:30:29 np0005535656 python3.9[62404]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:30:31 np0005535656 python3.9[62557]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:30:32 np0005535656 python3.9[62711]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:30:33 np0005535656 python3.9[62863]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:30:34 np0005535656 python3.9[63015]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:30:35 np0005535656 python3.9[63168]: ansible-service_facts Invoked
Nov 25 13:30:35 np0005535656 network[63185]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 13:30:35 np0005535656 network[63186]: 'network-scripts' will be removed from distribution in near future.
Nov 25 13:30:35 np0005535656 network[63187]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 13:30:41 np0005535656 python3.9[63639]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:30:44 np0005535656 python3.9[63792]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 25 13:30:45 np0005535656 python3.9[63944]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:30:46 np0005535656 python3.9[64069]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095444.938105-450-112322764022070/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:30:47 np0005535656 python3.9[64223]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:30:47 np0005535656 python3.9[64348]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095446.5946698-481-252454687642225/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:30:49 np0005535656 python3.9[64502]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:30:50 np0005535656 python3.9[64656]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:30:52 np0005535656 python3.9[64740]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:30:53 np0005535656 python3.9[64894]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:30:54 np0005535656 python3.9[64978]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:30:54 np0005535656 chronyd[802]: chronyd exiting
Nov 25 13:30:54 np0005535656 systemd[1]: Stopping NTP client/server...
Nov 25 13:30:54 np0005535656 systemd[1]: chronyd.service: Deactivated successfully.
Nov 25 13:30:54 np0005535656 systemd[1]: Stopped NTP client/server.
Nov 25 13:30:54 np0005535656 systemd[1]: Starting NTP client/server...
Nov 25 13:30:54 np0005535656 chronyd[64986]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 25 13:30:54 np0005535656 chronyd[64986]: Frequency -26.270 +/- 0.069 ppm read from /var/lib/chrony/drift
Nov 25 13:30:54 np0005535656 chronyd[64986]: Loaded seccomp filter (level 2)
Nov 25 13:30:54 np0005535656 systemd[1]: Started NTP client/server.
Nov 25 13:30:54 np0005535656 systemd[1]: session-15.scope: Deactivated successfully.
Nov 25 13:30:54 np0005535656 systemd[1]: session-15.scope: Consumed 29.637s CPU time.
Nov 25 13:30:54 np0005535656 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Nov 25 13:30:54 np0005535656 systemd-logind[788]: Removed session 15.
Nov 25 13:31:00 np0005535656 systemd-logind[788]: New session 16 of user zuul.
Nov 25 13:31:00 np0005535656 systemd[1]: Started Session 16 of User zuul.
Nov 25 13:31:01 np0005535656 python3.9[65166]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:31:02 np0005535656 python3.9[65322]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:03 np0005535656 python3.9[65497]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:04 np0005535656 python3.9[65575]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.234cbhho recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:05 np0005535656 python3.9[65727]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:06 np0005535656 python3.9[65850]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095464.7713022-108-201417667586165/.source _original_basename=._4r8bo60 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:07 np0005535656 python3.9[66002]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:31:07 np0005535656 python3.9[66154]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:08 np0005535656 python3.9[66277]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095467.4026408-156-100209865277589/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:31:09 np0005535656 python3.9[66429]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:09 np0005535656 python3.9[66552]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095468.7608883-156-87689507095451/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:31:10 np0005535656 python3.9[66704]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:11 np0005535656 python3.9[66856]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:12 np0005535656 python3.9[66979]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095471.0169754-230-110485637682143/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:12 np0005535656 python3.9[67131]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:13 np0005535656 python3.9[67254]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095472.406432-260-219667182555631/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:14 np0005535656 python3.9[67406]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:31:14 np0005535656 systemd[1]: Reloading.
Nov 25 13:31:14 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:31:14 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:31:15 np0005535656 systemd[1]: Reloading.
Nov 25 13:31:15 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:31:15 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:31:15 np0005535656 systemd[1]: Starting EDPM Container Shutdown...
Nov 25 13:31:15 np0005535656 systemd[1]: Finished EDPM Container Shutdown.
Nov 25 13:31:16 np0005535656 python3.9[67633]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:16 np0005535656 python3.9[67756]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095475.5592253-306-233866288865495/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:17 np0005535656 python3.9[67908]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:18 np0005535656 python3.9[68031]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095477.0980828-336-112224688201848/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:19 np0005535656 python3.9[68183]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:31:19 np0005535656 systemd[1]: Reloading.
Nov 25 13:31:19 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:31:19 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:31:20 np0005535656 systemd[1]: Reloading.
Nov 25 13:31:20 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:31:20 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:31:21 np0005535656 systemd[1]: Starting Create netns directory...
Nov 25 13:31:21 np0005535656 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 13:31:21 np0005535656 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 13:31:21 np0005535656 systemd[1]: Finished Create netns directory.
Nov 25 13:31:22 np0005535656 python3.9[68408]: ansible-ansible.builtin.service_facts Invoked
Nov 25 13:31:22 np0005535656 network[68425]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 13:31:22 np0005535656 network[68426]: 'network-scripts' will be removed from distribution in near future.
Nov 25 13:31:22 np0005535656 network[68427]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 13:31:27 np0005535656 python3.9[68689]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:31:27 np0005535656 systemd[1]: Reloading.
Nov 25 13:31:27 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:31:27 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:31:28 np0005535656 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 25 13:31:28 np0005535656 iptables.init[68729]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 25 13:31:28 np0005535656 iptables.init[68729]: iptables: Flushing firewall rules: [  OK  ]
Nov 25 13:31:28 np0005535656 systemd[1]: iptables.service: Deactivated successfully.
Nov 25 13:31:28 np0005535656 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 25 13:31:29 np0005535656 python3.9[68925]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:31:30 np0005535656 python3.9[69079]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:31:30 np0005535656 systemd[1]: Reloading.
Nov 25 13:31:30 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:31:30 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:31:30 np0005535656 systemd[1]: Starting Netfilter Tables...
Nov 25 13:31:30 np0005535656 systemd[1]: Finished Netfilter Tables.
Nov 25 13:31:31 np0005535656 python3.9[69271]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:31:33 np0005535656 python3.9[69424]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:33 np0005535656 python3.9[69549]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095492.4906209-474-135057690349477/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:34 np0005535656 python3.9[69702]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:31:34 np0005535656 systemd[1]: Reloading OpenSSH server daemon...
Nov 25 13:31:34 np0005535656 systemd[1]: Reloaded OpenSSH server daemon.
Nov 25 13:31:35 np0005535656 python3.9[69858]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:36 np0005535656 python3.9[70010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:37 np0005535656 python3.9[70133]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095495.918354-536-196545116207899/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:38 np0005535656 python3.9[70285]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 25 13:31:38 np0005535656 systemd[1]: Starting Time & Date Service...
Nov 25 13:31:38 np0005535656 systemd[1]: Started Time & Date Service.
Nov 25 13:31:39 np0005535656 python3.9[70441]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:40 np0005535656 python3.9[70593]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:40 np0005535656 python3.9[70716]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095499.6842463-606-140874131245477/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:41 np0005535656 python3.9[70868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:42 np0005535656 python3.9[70991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095501.1351388-636-219231923525880/.source.yaml _original_basename=.rruzztcm follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:43 np0005535656 python3.9[71143]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:43 np0005535656 python3.9[71267]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095502.5726666-666-201260985467128/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:44 np0005535656 python3.9[71419]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:31:45 np0005535656 python3.9[71572]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:31:46 np0005535656 python3[71725]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 13:31:47 np0005535656 python3.9[71877]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:48 np0005535656 python3.9[72000]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095507.1156504-745-172418588684791/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:49 np0005535656 python3.9[72152]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:49 np0005535656 python3.9[72277]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095508.622189-775-260596645444667/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:50 np0005535656 python3.9[72429]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:51 np0005535656 python3.9[72552]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095510.2100096-805-136394113986212/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:52 np0005535656 python3.9[72704]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:53 np0005535656 python3.9[72827]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095511.9541621-835-69802950953113/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:54 np0005535656 python3.9[72979]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:31:54 np0005535656 python3.9[73102]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095513.6294167-864-196621817047830/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:55 np0005535656 python3.9[73254]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:56 np0005535656 python3.9[73406]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:31:57 np0005535656 python3.9[73565]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:58 np0005535656 python3.9[73718]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:31:59 np0005535656 python3.9[73870]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:32:00 np0005535656 python3.9[74022]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 25 13:32:00 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 13:32:00 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 13:32:01 np0005535656 python3.9[74176]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 25 13:32:02 np0005535656 systemd[1]: session-16.scope: Deactivated successfully.
Nov 25 13:32:02 np0005535656 systemd[1]: session-16.scope: Consumed 43.048s CPU time.
Nov 25 13:32:02 np0005535656 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Nov 25 13:32:02 np0005535656 systemd-logind[788]: Removed session 16.
Nov 25 13:32:08 np0005535656 systemd-logind[788]: New session 17 of user zuul.
Nov 25 13:32:08 np0005535656 systemd[1]: Started Session 17 of User zuul.
Nov 25 13:32:08 np0005535656 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 25 13:32:08 np0005535656 python3.9[74359]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 25 13:32:09 np0005535656 python3.9[74511]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:32:11 np0005535656 python3.9[74663]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:32:12 np0005535656 python3.9[74815]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCZZGkUnIVtghWZhNKcNDe/+ydOm1lOBtNF3wzS2NBvnGwtUlt2yhJFyfftxGIJDNDXZy73A67Hbks7vC0inuTODA26f6X24zKr2bEiLNd5HL2S458PM/oFWYqHeM5p9bnLPQbuef697QXC4O51meGHAUtFE+EGufmWsTUiSy1Sx17v28DlxyArFpqSS2MQbm5XbZLls0yUDlkuF9X6alG/XALCGA6fxEwlUDDiJ4U/birqX4e76DuC4vW6SR78Anh+oPYhM/r7dlaedauuWYhBBkg6VztIpeHLIBr3JkkJvSqJP4GAnoIudP3T4FpSXfJgoDYhMQuVZsAazU+e1JTLij0lAOpfRjbpLRroOAYOmvuSiVmjKwQSqHcW4S++ts5TCUDDiFIwmJLMikiRaTDBi4c0cHY1K7iIqZ0T9N6Fe1lvehZ8hOxihG8mVpsdqJps2kykZEKjJZvAZQKTFz4v/xAr+hDuSHM6DBD/WgWKTSZRt4i+YK9QiLfwlRcb6lM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGLkQujfPZ/nWGbS+cTzOTTPk++bMv1Sy5MRYniBgPX/#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBaTcSAC7eBigRtrLfzEoy9vvGP1eWrwSwlY+N9bMd7pG11glV058jec6Jesm9fgP5YArNTRrshDpmHL6YPnMC0=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDct3GYZxllqkxXZKcaq+l9N/f3q9Ie62A9syWsBfwJf2nT6ZR3DGS5ZE4TYl7BOAsGN44XdteVlfHamruqL0ktP3KK2NeqL80/jH5SnjsfNQZ69wEjTiGyuRtBsgv03qbf7Hb3PEOFUoFgzzaY121Xdc+HTArtn3bvyb5YZvNS8FWYMZQ3hR553HRxiFCMTZaHWlPWPnrvCzV4cNLLByKKpUAynQYXVSIGN8SghSMLyresmP82bnfJ/lJVLRnhWyO5gq5IqIyePG1ZEnehk6ZVwr8txwwuzDXe6SzmwkMPl92HKk83mRmkUCHD/kfRJzTEVJZBUr6MNEpVMZd0Vi6v9seZnmsiKhe3kr2+oHXCNPkeD2JL+/RZmu2g28JPQgXON5DPBiZszAcB5qrjXYtwIe+wJr4vakM5ruPf7wvoWRFitA2LFPjARvURFw7pV8bHlL/LAjHnYvhsI546IqEPej5smfCBQ6yJUkhEMaEgQNQvGS88EY6WR7fAAsFvCEE=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAHYc6q5zd0HniDb+2+gj7GYi8S1OVxETFdJUcmlFjV5#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBcs/elBT5KCnEaLdSnJTdqVz3NHr7P2EbvzaoG0bglrbI6CQ2kHefibBCZaIo/o8oiVmKbYIH0HLDZSpo8ofv8=#012 create=True mode=0644 path=/tmp/ansible.tc8p2qen state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:32:13 np0005535656 python3.9[74967]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.tc8p2qen' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:32:14 np0005535656 python3.9[75121]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.tc8p2qen state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:32:15 np0005535656 systemd[1]: session-17.scope: Deactivated successfully.
Nov 25 13:32:15 np0005535656 systemd[1]: session-17.scope: Consumed 3.814s CPU time.
Nov 25 13:32:15 np0005535656 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Nov 25 13:32:15 np0005535656 systemd-logind[788]: Removed session 17.
Nov 25 13:32:21 np0005535656 systemd-logind[788]: New session 18 of user zuul.
Nov 25 13:32:21 np0005535656 systemd[1]: Started Session 18 of User zuul.
Nov 25 13:32:23 np0005535656 python3.9[75299]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:32:24 np0005535656 python3.9[75455]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 25 13:32:26 np0005535656 python3.9[75609]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:32:27 np0005535656 python3.9[75762]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:32:28 np0005535656 python3.9[75915]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:32:29 np0005535656 python3.9[76069]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:32:30 np0005535656 python3.9[76224]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:32:31 np0005535656 systemd[1]: session-18.scope: Deactivated successfully.
Nov 25 13:32:31 np0005535656 systemd[1]: session-18.scope: Consumed 5.355s CPU time.
Nov 25 13:32:31 np0005535656 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Nov 25 13:32:31 np0005535656 systemd-logind[788]: Removed session 18.
Nov 25 13:32:36 np0005535656 systemd-logind[788]: New session 19 of user zuul.
Nov 25 13:32:36 np0005535656 systemd[1]: Started Session 19 of User zuul.
Nov 25 13:32:37 np0005535656 python3.9[76402]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:32:38 np0005535656 python3.9[76558]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:32:39 np0005535656 python3.9[76642]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 13:32:42 np0005535656 python3.9[76793]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:32:43 np0005535656 python3.9[76944]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 13:32:44 np0005535656 python3.9[77094]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:32:45 np0005535656 python3.9[77244]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:32:45 np0005535656 systemd[1]: session-19.scope: Deactivated successfully.
Nov 25 13:32:45 np0005535656 systemd[1]: session-19.scope: Consumed 6.678s CPU time.
Nov 25 13:32:45 np0005535656 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Nov 25 13:32:45 np0005535656 systemd-logind[788]: Removed session 19.
Nov 25 13:32:51 np0005535656 systemd-logind[788]: New session 20 of user zuul.
Nov 25 13:32:51 np0005535656 systemd[1]: Started Session 20 of User zuul.
Nov 25 13:32:52 np0005535656 python3.9[77422]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:32:54 np0005535656 python3.9[77578]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:32:55 np0005535656 python3.9[77730]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:32:56 np0005535656 python3.9[77882]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:32:56 np0005535656 python3.9[78005]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095575.402162-114-252337845620218/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=034acf5639bb3890d369e80b8fcd6e5ebe8eec3e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:32:57 np0005535656 python3.9[78157]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:32:58 np0005535656 python3.9[78280]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095577.1509166-114-143671718913560/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=0a803ccdbdc283fe90f04ca0f5aa7ec801dbb84b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:32:59 np0005535656 python3.9[78432]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:32:59 np0005535656 python3.9[78555]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095578.566361-114-74880572747948/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=8e5b919ffd24fd5e34876f7827d70a7eead471cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:00 np0005535656 python3.9[78707]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:01 np0005535656 python3.9[78859]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:02 np0005535656 python3.9[79011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:02 np0005535656 python3.9[79134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095581.7037077-235-191340891522261/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=e39fc34f7b1f058c0b5215def31d13ef8882bc3a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:03 np0005535656 python3.9[79286]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:04 np0005535656 python3.9[79409]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095583.1230702-235-73475938971992/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=75798a1b7c64ee6e535d18533584b05e44c8478b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:04 np0005535656 chronyd[64986]: Selected source 216.232.132.102 (pool.ntp.org)
Nov 25 13:33:05 np0005535656 python3.9[79561]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:05 np0005535656 python3.9[79684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095584.5068648-235-56659324482821/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=a92b6ea0d745a2037d8a8fabb708c184cfd67730 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:06 np0005535656 python3.9[79836]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:07 np0005535656 python3.9[79988]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:08 np0005535656 python3.9[80142]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:08 np0005535656 python3.9[80265]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095587.581945-352-255137494575337/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=dcc8b7f5700547b3936fbe7be31481951ab8947b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:09 np0005535656 python3.9[80417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:10 np0005535656 python3.9[80540]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095588.9526637-352-216182666765845/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=1c34443136315ca10cc789dbaafdfe7e254312b5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:10 np0005535656 python3.9[80692]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:11 np0005535656 python3.9[80815]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095590.3624122-352-250549041514342/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=ae776e35fef07d1aa75b5285c787923d9aa347d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:12 np0005535656 python3.9[80967]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:13 np0005535656 python3.9[81119]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:14 np0005535656 python3.9[81271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:14 np0005535656 python3.9[81394]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095593.4972456-470-217974034489730/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=abb53bd673c44abe9b40a4bb6e753ac8ddddf35f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:15 np0005535656 python3.9[81546]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:16 np0005535656 python3.9[81669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095595.0371816-470-66010935765362/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=1c34443136315ca10cc789dbaafdfe7e254312b5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:17 np0005535656 python3.9[81821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:18 np0005535656 python3.9[81944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095596.5487921-470-206794987023452/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=dab5bf9681aa03d35f991ec2b541f70c3cf4df5b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:19 np0005535656 python3.9[82096]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:20 np0005535656 python3.9[82248]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:21 np0005535656 python3.9[82371]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095599.9430306-614-145623803153078/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=aaea985d0cb19289b719169c437f65790c5644d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:21 np0005535656 python3.9[82523]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:22 np0005535656 python3.9[82675]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:23 np0005535656 python3.9[82798]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095602.1250093-660-18231421992470/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=aaea985d0cb19289b719169c437f65790c5644d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:24 np0005535656 python3.9[82950]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:24 np0005535656 python3.9[83102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:25 np0005535656 python3.9[83225]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095604.3846743-708-42888345957594/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=aaea985d0cb19289b719169c437f65790c5644d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:26 np0005535656 python3.9[83377]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:27 np0005535656 python3.9[83529]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:28 np0005535656 python3.9[83652]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095606.842742-759-81709558530927/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=aaea985d0cb19289b719169c437f65790c5644d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:29 np0005535656 python3.9[83804]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:29 np0005535656 python3.9[83956]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:30 np0005535656 python3.9[84079]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095609.2246914-810-164773976033437/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=aaea985d0cb19289b719169c437f65790c5644d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:31 np0005535656 python3.9[84231]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:32 np0005535656 python3.9[84383]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:32 np0005535656 python3.9[84506]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095611.5141501-859-141209497176727/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=aaea985d0cb19289b719169c437f65790c5644d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:33 np0005535656 python3.9[84658]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:34 np0005535656 python3.9[84810]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:35 np0005535656 python3.9[84933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095613.7501187-907-278566904804313/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=aaea985d0cb19289b719169c437f65790c5644d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:35 np0005535656 systemd-logind[788]: Session 20 logged out. Waiting for processes to exit.
Nov 25 13:33:35 np0005535656 systemd[1]: session-20.scope: Deactivated successfully.
Nov 25 13:33:35 np0005535656 systemd[1]: session-20.scope: Consumed 35.031s CPU time.
Nov 25 13:33:35 np0005535656 systemd-logind[788]: Removed session 20.
Nov 25 13:33:41 np0005535656 systemd-logind[788]: New session 21 of user zuul.
Nov 25 13:33:41 np0005535656 systemd[1]: Started Session 21 of User zuul.
Nov 25 13:33:42 np0005535656 python3.9[85111]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:33:43 np0005535656 python3.9[85267]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:44 np0005535656 python3.9[85419]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:33:45 np0005535656 python3.9[85569]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:33:46 np0005535656 python3.9[85721]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 25 13:33:48 np0005535656 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 25 13:33:48 np0005535656 python3.9[85877]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:33:49 np0005535656 python3.9[85961]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:33:52 np0005535656 python3.9[86114]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 13:33:53 np0005535656 python3[86269]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 25 13:33:54 np0005535656 python3.9[86421]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:55 np0005535656 python3.9[86573]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:56 np0005535656 python3.9[86651]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:57 np0005535656 python3.9[86803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:57 np0005535656 python3.9[86881]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ofs3rj8l recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:33:58 np0005535656 python3.9[87033]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:33:59 np0005535656 python3.9[87111]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:00 np0005535656 python3.9[87263]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:01 np0005535656 python3[87416]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 13:34:02 np0005535656 python3.9[87568]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:02 np0005535656 python3.9[87693]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095641.4316514-300-156272754969362/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:03 np0005535656 python3.9[87845]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:04 np0005535656 python3.9[87970]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095643.13345-330-80980503520027/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:05 np0005535656 python3.9[88122]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:06 np0005535656 python3.9[88247]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095645.1170888-360-92038438599120/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:07 np0005535656 python3.9[88399]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:08 np0005535656 python3.9[88524]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095646.7334077-390-131458095274533/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:08 np0005535656 python3.9[88676]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:09 np0005535656 python3.9[88801]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095648.32993-420-259691749228625/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:10 np0005535656 python3.9[88953]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:11 np0005535656 python3.9[89105]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:12 np0005535656 python3.9[89260]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:13 np0005535656 python3.9[89412]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:14 np0005535656 python3.9[89565]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:34:15 np0005535656 python3.9[89719]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:15 np0005535656 python3.9[89874]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:17 np0005535656 python3.9[90024]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:34:18 np0005535656 python3.9[90178]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:18 np0005535656 ovs-vsctl[90179]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 25 13:34:19 np0005535656 python3.9[90331]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:20 np0005535656 python3.9[90486]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:20 np0005535656 ovs-vsctl[90487]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 25 13:34:21 np0005535656 python3.9[90637]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:34:22 np0005535656 python3.9[90791]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:34:23 np0005535656 python3.9[90943]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:23 np0005535656 python3.9[91021]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:34:24 np0005535656 python3.9[91173]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:24 np0005535656 python3.9[91251]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:34:25 np0005535656 python3.9[91403]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:26 np0005535656 python3.9[91555]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:27 np0005535656 python3.9[91633]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:28 np0005535656 python3.9[91785]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:28 np0005535656 python3.9[91863]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:29 np0005535656 python3.9[92016]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:34:29 np0005535656 systemd[1]: Reloading.
Nov 25 13:34:29 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:34:29 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:34:30 np0005535656 python3.9[92205]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:31 np0005535656 python3.9[92285]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:32 np0005535656 python3.9[92437]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:32 np0005535656 python3.9[92515]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:34 np0005535656 python3.9[92667]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:34:34 np0005535656 systemd[1]: Reloading.
Nov 25 13:34:34 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:34:34 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:34:34 np0005535656 systemd[1]: Starting Create netns directory...
Nov 25 13:34:34 np0005535656 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 13:34:34 np0005535656 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 13:34:34 np0005535656 systemd[1]: Finished Create netns directory.
Nov 25 13:34:35 np0005535656 python3.9[92862]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:34:36 np0005535656 python3.9[93014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:37 np0005535656 python3.9[93137]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095675.7504656-922-63812287576198/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:34:38 np0005535656 python3.9[93289]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:34:38 np0005535656 python3.9[93441]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:34:39 np0005535656 python3.9[93564]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095678.3714535-972-65031154598673/.source.json _original_basename=.xwisyaaq follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:40 np0005535656 python3.9[93716]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:43 np0005535656 python3.9[94143]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 25 13:34:44 np0005535656 python3.9[94295]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 13:34:45 np0005535656 python3.9[94447]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 25 13:34:45 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:34:47 np0005535656 python3[94611]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 13:34:47 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:34:47 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:34:47 np0005535656 podman[94646]: 2025-11-25 18:34:47.450964113 +0000 UTC m=+0.054722696 container create b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:34:47 np0005535656 podman[94646]: 2025-11-25 18:34:47.422193056 +0000 UTC m=+0.025951689 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 25 13:34:47 np0005535656 python3[94611]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 25 13:34:48 np0005535656 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 13:34:48 np0005535656 python3.9[94836]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:34:49 np0005535656 python3.9[94990]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:50 np0005535656 python3.9[95066]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:34:50 np0005535656 python3.9[95217]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764095690.152941-1148-172517918135494/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:34:51 np0005535656 python3.9[95293]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:34:51 np0005535656 systemd[1]: Reloading.
Nov 25 13:34:51 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:34:51 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:34:52 np0005535656 python3.9[95404]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:34:52 np0005535656 systemd[1]: Reloading.
Nov 25 13:34:52 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:34:52 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:34:52 np0005535656 systemd[1]: Starting ovn_controller container...
Nov 25 13:34:53 np0005535656 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 25 13:34:53 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:34:53 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bba02ac1236cc040ffa835302330f73216635836141daee7d79e455bd3aa2407/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 25 13:34:53 np0005535656 systemd[1]: Started /usr/bin/podman healthcheck run b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff.
Nov 25 13:34:53 np0005535656 podman[95445]: 2025-11-25 18:34:53.123155602 +0000 UTC m=+0.180748532 container init b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: + sudo -E kolla_set_configs
Nov 25 13:34:53 np0005535656 podman[95445]: 2025-11-25 18:34:53.147797413 +0000 UTC m=+0.205390333 container start b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 13:34:53 np0005535656 edpm-start-podman-container[95445]: ovn_controller
Nov 25 13:34:53 np0005535656 systemd[1]: Created slice User Slice of UID 0.
Nov 25 13:34:53 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 25 13:34:53 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 25 13:34:53 np0005535656 systemd[1]: Starting User Manager for UID 0...
Nov 25 13:34:53 np0005535656 edpm-start-podman-container[95444]: Creating additional drop-in dependency for "ovn_controller" (b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff)
Nov 25 13:34:53 np0005535656 podman[95466]: 2025-11-25 18:34:53.29392353 +0000 UTC m=+0.124599798 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 25 13:34:53 np0005535656 systemd[1]: b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff-3c4e1358886b537c.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 13:34:53 np0005535656 systemd[1]: b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff-3c4e1358886b537c.service: Failed with result 'exit-code'.
Nov 25 13:34:53 np0005535656 systemd[1]: Reloading.
Nov 25 13:34:53 np0005535656 systemd[95493]: Queued start job for default target Main User Target.
Nov 25 13:34:53 np0005535656 systemd[95493]: Created slice User Application Slice.
Nov 25 13:34:53 np0005535656 systemd[95493]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 25 13:34:53 np0005535656 systemd[95493]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 13:34:53 np0005535656 systemd[95493]: Reached target Paths.
Nov 25 13:34:53 np0005535656 systemd[95493]: Reached target Timers.
Nov 25 13:34:53 np0005535656 systemd[95493]: Starting D-Bus User Message Bus Socket...
Nov 25 13:34:53 np0005535656 systemd[95493]: Starting Create User's Volatile Files and Directories...
Nov 25 13:34:53 np0005535656 systemd[95493]: Finished Create User's Volatile Files and Directories.
Nov 25 13:34:53 np0005535656 systemd[95493]: Listening on D-Bus User Message Bus Socket.
Nov 25 13:34:53 np0005535656 systemd[95493]: Reached target Sockets.
Nov 25 13:34:53 np0005535656 systemd[95493]: Reached target Basic System.
Nov 25 13:34:53 np0005535656 systemd[95493]: Reached target Main User Target.
Nov 25 13:34:53 np0005535656 systemd[95493]: Startup finished in 145ms.
Nov 25 13:34:53 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:34:53 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:34:53 np0005535656 systemd[1]: Started User Manager for UID 0.
Nov 25 13:34:53 np0005535656 systemd[1]: Started ovn_controller container.
Nov 25 13:34:53 np0005535656 systemd[1]: Started Session c1 of User root.
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: INFO:__main__:Validating config file
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: INFO:__main__:Writing out command to execute
Nov 25 13:34:53 np0005535656 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: ++ cat /run_command
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: + ARGS=
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: + sudo kolla_copy_cacerts
Nov 25 13:34:53 np0005535656 systemd[1]: Started Session c2 of User root.
Nov 25 13:34:53 np0005535656 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: + [[ ! -n '' ]]
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: + . kolla_extend_start
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: + umask 0022
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 25 13:34:53 np0005535656 NetworkManager[55548]: <info>  [1764095693.8112] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 25 13:34:53 np0005535656 NetworkManager[55548]: <info>  [1764095693.8123] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 13:34:53 np0005535656 NetworkManager[55548]: <info>  [1764095693.8137] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 25 13:34:53 np0005535656 NetworkManager[55548]: <info>  [1764095693.8143] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 25 13:34:53 np0005535656 NetworkManager[55548]: <info>  [1764095693.8147] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 25 13:34:53 np0005535656 kernel: br-int: entered promiscuous mode
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 13:34:53 np0005535656 ovn_controller[95460]: 2025-11-25T18:34:53Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 13:34:53 np0005535656 NetworkManager[55548]: <info>  [1764095693.8328] manager: (ovn-b2fe70-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 25 13:34:53 np0005535656 NetworkManager[55548]: <info>  [1764095693.8337] manager: (ovn-e972f2-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Nov 25 13:34:53 np0005535656 kernel: genev_sys_6081: entered promiscuous mode
Nov 25 13:34:53 np0005535656 NetworkManager[55548]: <info>  [1764095693.8566] device (genev_sys_6081): carrier: link connected
Nov 25 13:34:53 np0005535656 NetworkManager[55548]: <info>  [1764095693.8570] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Nov 25 13:34:53 np0005535656 systemd-udevd[95613]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:34:53 np0005535656 systemd-udevd[95618]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:34:54 np0005535656 python3.9[95727]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:54 np0005535656 ovs-vsctl[95728]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 25 13:34:55 np0005535656 python3.9[95880]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:55 np0005535656 ovs-vsctl[95882]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 25 13:34:56 np0005535656 python3.9[96035]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:34:56 np0005535656 ovs-vsctl[96036]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 25 13:34:57 np0005535656 systemd[1]: session-21.scope: Deactivated successfully.
Nov 25 13:34:57 np0005535656 systemd[1]: session-21.scope: Consumed 56.730s CPU time.
Nov 25 13:34:57 np0005535656 systemd-logind[788]: Session 21 logged out. Waiting for processes to exit.
Nov 25 13:34:57 np0005535656 systemd-logind[788]: Removed session 21.
Nov 25 13:35:03 np0005535656 systemd-logind[788]: New session 23 of user zuul.
Nov 25 13:35:03 np0005535656 systemd[1]: Started Session 23 of User zuul.
Nov 25 13:35:03 np0005535656 systemd[1]: Stopping User Manager for UID 0...
Nov 25 13:35:03 np0005535656 systemd[95493]: Activating special unit Exit the Session...
Nov 25 13:35:03 np0005535656 systemd[95493]: Stopped target Main User Target.
Nov 25 13:35:03 np0005535656 systemd[95493]: Stopped target Basic System.
Nov 25 13:35:03 np0005535656 systemd[95493]: Stopped target Paths.
Nov 25 13:35:03 np0005535656 systemd[95493]: Stopped target Sockets.
Nov 25 13:35:03 np0005535656 systemd[95493]: Stopped target Timers.
Nov 25 13:35:03 np0005535656 systemd[95493]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 13:35:03 np0005535656 systemd[95493]: Closed D-Bus User Message Bus Socket.
Nov 25 13:35:03 np0005535656 systemd[95493]: Stopped Create User's Volatile Files and Directories.
Nov 25 13:35:03 np0005535656 systemd[95493]: Removed slice User Application Slice.
Nov 25 13:35:03 np0005535656 systemd[95493]: Reached target Shutdown.
Nov 25 13:35:03 np0005535656 systemd[95493]: Finished Exit the Session.
Nov 25 13:35:03 np0005535656 systemd[95493]: Reached target Exit the Session.
Nov 25 13:35:03 np0005535656 systemd[1]: user@0.service: Deactivated successfully.
Nov 25 13:35:03 np0005535656 systemd[1]: Stopped User Manager for UID 0.
Nov 25 13:35:03 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 25 13:35:03 np0005535656 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 25 13:35:03 np0005535656 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 25 13:35:03 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 25 13:35:03 np0005535656 systemd[1]: Removed slice User Slice of UID 0.
Nov 25 13:35:04 np0005535656 python3.9[96216]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:35:05 np0005535656 python3.9[96372]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:06 np0005535656 python3.9[96524]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:07 np0005535656 python3.9[96676]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:08 np0005535656 python3.9[96828]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:09 np0005535656 python3.9[96980]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:09 np0005535656 python3.9[97130]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:35:11 np0005535656 python3.9[97282]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 25 13:35:12 np0005535656 python3.9[97433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:13 np0005535656 python3.9[97554]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095711.9002962-158-70563820268245/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:14 np0005535656 python3.9[97704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:14 np0005535656 python3.9[97825]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095713.6881425-188-134278874979515/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:15 np0005535656 python3.9[97977]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:35:16 np0005535656 python3.9[98061]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:35:19 np0005535656 python3.9[98214]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 13:35:20 np0005535656 python3.9[98367]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:21 np0005535656 python3.9[98488]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095719.8833058-262-144411998001009/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:21 np0005535656 python3.9[98638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:22 np0005535656 python3.9[98759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095721.266189-262-12204458085737/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:23 np0005535656 ovn_controller[95460]: 2025-11-25T18:35:23Z|00025|memory|INFO|16256 kB peak resident set size after 29.8 seconds
Nov 25 13:35:23 np0005535656 ovn_controller[95460]: 2025-11-25T18:35:23Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Nov 25 13:35:23 np0005535656 podman[98883]: 2025-11-25 18:35:23.613644421 +0000 UTC m=+0.123696590 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 25 13:35:23 np0005535656 python3.9[98920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:24 np0005535656 python3.9[99054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095723.2084794-350-109949337506878/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:25 np0005535656 python3.9[99204]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:25 np0005535656 python3.9[99325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095724.6151164-350-185194355410626/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:26 np0005535656 python3.9[99475]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:35:27 np0005535656 python3.9[99629]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:28 np0005535656 python3.9[99781]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:28 np0005535656 python3.9[99859]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:29 np0005535656 python3.9[100011]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:30 np0005535656 python3.9[100089]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:30 np0005535656 python3.9[100241]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:35:31 np0005535656 python3.9[100393]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:32 np0005535656 python3.9[100471]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:35:32 np0005535656 python3.9[100623]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:33 np0005535656 python3.9[100701]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:35:34 np0005535656 python3.9[100853]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:35:34 np0005535656 systemd[1]: Reloading.
Nov 25 13:35:34 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:35:34 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:35:35 np0005535656 python3.9[101042]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:35 np0005535656 python3.9[101120]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:35:36 np0005535656 python3.9[101272]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:37 np0005535656 python3.9[101350]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:35:38 np0005535656 python3.9[101502]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:35:38 np0005535656 systemd[1]: Reloading.
Nov 25 13:35:38 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:35:38 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:35:38 np0005535656 systemd[1]: Starting Create netns directory...
Nov 25 13:35:38 np0005535656 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 13:35:38 np0005535656 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 13:35:38 np0005535656 systemd[1]: Finished Create netns directory.
Nov 25 13:35:39 np0005535656 python3.9[101696]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:40 np0005535656 python3.9[101848]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:41 np0005535656 python3.9[101971]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095739.8285673-652-184861122537819/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:42 np0005535656 python3.9[102123]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:35:42 np0005535656 python3.9[102275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:35:43 np0005535656 python3.9[102398]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095742.363441-702-17195335316223/.source.json _original_basename=.bazdmh_q follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:35:44 np0005535656 python3.9[102550]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:35:46 np0005535656 python3.9[102977]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 25 13:35:47 np0005535656 python3.9[103129]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 13:35:48 np0005535656 python3.9[103281]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 25 13:35:50 np0005535656 python3[103459]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 13:35:51 np0005535656 podman[103495]: 2025-11-25 18:35:51.080795278 +0000 UTC m=+0.054077674 container create e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 25 13:35:51 np0005535656 podman[103495]: 2025-11-25 18:35:51.054232164 +0000 UTC m=+0.027514600 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 13:35:51 np0005535656 python3[103459]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 13:35:52 np0005535656 python3.9[103685]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:35:52 np0005535656 python3.9[103841]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:35:53 np0005535656 python3.9[103917]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:35:54 np0005535656 podman[103998]: 2025-11-25 18:35:54.098608552 +0000 UTC m=+0.203780450 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 25 13:35:54 np0005535656 python3.9[104097]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764095753.571416-878-149897068738873/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:35:55 np0005535656 python3.9[104173]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:35:55 np0005535656 systemd[1]: Reloading.
Nov 25 13:35:55 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:35:55 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:35:56 np0005535656 python3.9[104284]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:35:56 np0005535656 systemd[1]: Reloading.
Nov 25 13:35:56 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:35:56 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:35:56 np0005535656 systemd[1]: Starting ovn_metadata_agent container...
Nov 25 13:35:56 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:35:56 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b80b874e701f151816ddb41368dc4791419d4f98baa304c899eba1596c463d0b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 25 13:35:56 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b80b874e701f151816ddb41368dc4791419d4f98baa304c899eba1596c463d0b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 13:35:56 np0005535656 systemd[1]: Started /usr/bin/podman healthcheck run e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f.
Nov 25 13:35:56 np0005535656 podman[104325]: 2025-11-25 18:35:56.933828984 +0000 UTC m=+0.259652581 container init e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:35:56 np0005535656 ovn_metadata_agent[104341]: + sudo -E kolla_set_configs
Nov 25 13:35:56 np0005535656 podman[104325]: 2025-11-25 18:35:56.971667415 +0000 UTC m=+0.297491002 container start e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 25 13:35:57 np0005535656 edpm-start-podman-container[104325]: ovn_metadata_agent
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Validating config file
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Copying service configuration files
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Writing out command to execute
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: ++ cat /run_command
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: + CMD=neutron-ovn-metadata-agent
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: + ARGS=
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: + sudo kolla_copy_cacerts
Nov 25 13:35:57 np0005535656 edpm-start-podman-container[104324]: Creating additional drop-in dependency for "ovn_metadata_agent" (e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f)
Nov 25 13:35:57 np0005535656 podman[104348]: 2025-11-25 18:35:57.107769651 +0000 UTC m=+0.114865329 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: + [[ ! -n '' ]]
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: + . kolla_extend_start
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: Running command: 'neutron-ovn-metadata-agent'
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: + umask 0022
Nov 25 13:35:57 np0005535656 ovn_metadata_agent[104341]: + exec neutron-ovn-metadata-agent
Nov 25 13:35:57 np0005535656 systemd[1]: Reloading.
Nov 25 13:35:57 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:35:57 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:35:57 np0005535656 systemd[1]: Started ovn_metadata_agent container.
Nov 25 13:35:58 np0005535656 systemd[1]: session-23.scope: Deactivated successfully.
Nov 25 13:35:58 np0005535656 systemd[1]: session-23.scope: Consumed 41.358s CPU time.
Nov 25 13:35:58 np0005535656 systemd-logind[788]: Session 23 logged out. Waiting for processes to exit.
Nov 25 13:35:58 np0005535656 systemd-logind[788]: Removed session 23.
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.007 104346 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.007 104346 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.008 104346 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.008 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.008 104346 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.008 104346 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.008 104346 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.008 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.009 104346 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.009 104346 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.009 104346 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.009 104346 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.009 104346 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.009 104346 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.009 104346 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.009 104346 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.009 104346 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.010 104346 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.010 104346 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.010 104346 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.010 104346 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.010 104346 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.010 104346 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.010 104346 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.010 104346 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.011 104346 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.011 104346 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.011 104346 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.011 104346 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.011 104346 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.011 104346 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.011 104346 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.011 104346 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.011 104346 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.012 104346 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.012 104346 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.012 104346 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.012 104346 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.012 104346 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.012 104346 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.012 104346 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.012 104346 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.013 104346 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.014 104346 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.014 104346 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.014 104346 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.014 104346 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.014 104346 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.014 104346 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.014 104346 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.014 104346 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.014 104346 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.015 104346 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.015 104346 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.015 104346 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.015 104346 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.015 104346 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.015 104346 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.015 104346 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.015 104346 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.015 104346 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.016 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.016 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.016 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.016 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.016 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.016 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.016 104346 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.016 104346 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.016 104346 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.017 104346 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.018 104346 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.019 104346 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.019 104346 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.019 104346 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.019 104346 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.019 104346 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.019 104346 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.019 104346 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.019 104346 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.019 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.020 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.020 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.020 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.020 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.020 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.020 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.020 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.020 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.020 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.021 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.021 104346 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.021 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.021 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.021 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.021 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.021 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.021 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.021 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.022 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.022 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.022 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.022 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.022 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.022 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.022 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.022 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.022 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.023 104346 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.023 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.023 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.023 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.023 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.023 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.023 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.023 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.024 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.025 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.025 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.025 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.025 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.025 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.025 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.025 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.025 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.025 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.026 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.026 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.026 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.026 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.026 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.026 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.026 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.026 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.026 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.027 104346 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.028 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.028 104346 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.028 104346 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.028 104346 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.028 104346 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.028 104346 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.028 104346 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.028 104346 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.028 104346 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.029 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.029 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.029 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.029 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.029 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.029 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.029 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.029 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.029 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.030 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.031 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.031 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.031 104346 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.031 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.031 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.031 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.031 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.031 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.031 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.032 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.032 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.032 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.032 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.032 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.032 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.032 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.032 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.032 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.033 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.033 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.033 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.033 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.033 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.033 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.033 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.033 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.033 104346 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.034 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.034 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.034 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.034 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.034 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.034 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.034 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.034 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.034 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.035 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.035 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.035 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.035 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.035 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.035 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.035 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.035 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.035 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.036 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.036 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.036 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.036 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.036 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.036 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.036 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.037 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.037 104346 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.037 104346 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.037 104346 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.037 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.037 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.037 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.037 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.037 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.038 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.039 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.039 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.039 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.039 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.039 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.039 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.039 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.039 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.039 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.040 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.040 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.040 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.040 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.040 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.040 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.040 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.040 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.040 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.041 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.041 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.041 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.041 104346 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.041 104346 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.051 104346 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.051 104346 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.052 104346 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.052 104346 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.052 104346 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.066 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 0dba517c-b8b5-44c5-b9d2-340b509da9f7 (UUID: 0dba517c-b8b5-44c5-b9d2-340b509da9f7) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.095 104346 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.096 104346 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.096 104346 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.096 104346 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.099 104346 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.107 104346 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.113 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '0dba517c-b8b5-44c5-b9d2-340b509da9f7'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], external_ids={}, name=0dba517c-b8b5-44c5-b9d2-340b509da9f7, nb_cfg_timestamp=1764095701826, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.114 104346 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f89972f3b80>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.115 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.115 104346 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.115 104346 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.116 104346 INFO oslo_service.service [-] Starting 1 workers#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.121 104346 DEBUG oslo_service.service [-] Started child 104451 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.125 104346 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpkco5_9e0/privsep.sock']#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.127 104451 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-497296'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.165 104451 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.166 104451 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.166 104451 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.171 104451 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.180 104451 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.193 104451 INFO eventlet.wsgi.server [-] (104451) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Nov 25 13:35:59 np0005535656 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.905 104346 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.907 104346 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpkco5_9e0/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.702 104456 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.710 104456 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.714 104456 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.714 104456 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104456
Nov 25 13:35:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:35:59.912 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[a92b0303-e5a8-4e4d-b260-26918d0f5f31]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.440 104456 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.440 104456 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.440 104456 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.936 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[dce80118-16a4-42e3-ba12-6036a3c5723e]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.940 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, column=external_ids, values=({'neutron:ovn-metadata-id': 'f9de11b9-668a-5430-bf5a-3d1455ebc9c3'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.948 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.954 104346 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.954 104346 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.954 104346 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.955 104346 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.955 104346 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.955 104346 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.955 104346 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.955 104346 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.955 104346 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.956 104346 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.956 104346 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.956 104346 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.956 104346 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.956 104346 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.956 104346 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.956 104346 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.957 104346 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.957 104346 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.957 104346 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.957 104346 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.957 104346 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.957 104346 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.957 104346 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.957 104346 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.958 104346 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.958 104346 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.958 104346 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.958 104346 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.958 104346 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.958 104346 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.958 104346 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.959 104346 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.959 104346 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.959 104346 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.959 104346 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.959 104346 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.960 104346 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.960 104346 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.960 104346 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.960 104346 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.960 104346 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.960 104346 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.960 104346 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.961 104346 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.961 104346 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.961 104346 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.961 104346 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.961 104346 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.962 104346 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.962 104346 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.962 104346 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.962 104346 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.962 104346 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.962 104346 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.962 104346 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.963 104346 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.963 104346 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.963 104346 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.963 104346 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.963 104346 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.963 104346 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.963 104346 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.964 104346 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.964 104346 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.964 104346 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.964 104346 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.964 104346 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.964 104346 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.964 104346 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.965 104346 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.965 104346 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.965 104346 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.965 104346 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.965 104346 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.965 104346 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.965 104346 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.966 104346 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.966 104346 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.966 104346 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.966 104346 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.966 104346 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.966 104346 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.966 104346 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.966 104346 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.967 104346 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.968 104346 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.969 104346 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.969 104346 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.969 104346 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.969 104346 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.969 104346 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.969 104346 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.969 104346 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.969 104346 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.970 104346 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.970 104346 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.970 104346 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.970 104346 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.970 104346 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.970 104346 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.970 104346 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.970 104346 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.971 104346 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.971 104346 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.971 104346 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.971 104346 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.971 104346 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.971 104346 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.971 104346 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.971 104346 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.972 104346 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.972 104346 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.972 104346 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.972 104346 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.972 104346 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.972 104346 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.972 104346 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.972 104346 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.973 104346 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.973 104346 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.973 104346 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.973 104346 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.973 104346 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.973 104346 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.973 104346 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.973 104346 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.973 104346 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.974 104346 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.975 104346 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.976 104346 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.976 104346 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.976 104346 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.976 104346 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.976 104346 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.976 104346 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.976 104346 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.976 104346 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.977 104346 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.977 104346 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.977 104346 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.977 104346 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.977 104346 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.977 104346 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.977 104346 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.977 104346 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.978 104346 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.978 104346 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.978 104346 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.978 104346 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.978 104346 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.978 104346 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.978 104346 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.978 104346 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.979 104346 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.979 104346 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.979 104346 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.979 104346 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.979 104346 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.979 104346 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.979 104346 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.979 104346 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.980 104346 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.980 104346 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.980 104346 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.980 104346 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.980 104346 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.980 104346 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.980 104346 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.980 104346 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.980 104346 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.981 104346 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.981 104346 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.981 104346 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.981 104346 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.981 104346 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.981 104346 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.981 104346 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.981 104346 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.981 104346 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.982 104346 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.982 104346 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.982 104346 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.982 104346 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.982 104346 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.982 104346 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.982 104346 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.983 104346 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.983 104346 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.983 104346 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.983 104346 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.983 104346 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.983 104346 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.983 104346 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.983 104346 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.983 104346 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.984 104346 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.984 104346 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.984 104346 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.984 104346 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.984 104346 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.984 104346 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.984 104346 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.984 104346 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.984 104346 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.985 104346 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.985 104346 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.985 104346 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.985 104346 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.985 104346 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.985 104346 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.985 104346 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.985 104346 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.985 104346 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.986 104346 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.986 104346 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.986 104346 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.986 104346 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.986 104346 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.986 104346 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.986 104346 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.986 104346 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.986 104346 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.987 104346 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.987 104346 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.987 104346 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.987 104346 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.987 104346 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.987 104346 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.987 104346 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.987 104346 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.988 104346 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.988 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.988 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.988 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.988 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.988 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.988 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.989 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.989 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.989 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.989 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.989 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.989 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.989 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.990 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.990 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.990 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.990 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.990 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.990 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.990 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.990 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.990 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.991 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.991 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.991 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.991 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.991 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.991 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.991 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.991 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.992 104346 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.992 104346 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.992 104346 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.992 104346 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.992 104346 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:36:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:00.992 104346 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 25 13:36:03 np0005535656 systemd-logind[788]: New session 24 of user zuul.
Nov 25 13:36:03 np0005535656 systemd[1]: Started Session 24 of User zuul.
Nov 25 13:36:04 np0005535656 python3.9[104614]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:36:06 np0005535656 python3.9[104770]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:36:07 np0005535656 python3.9[104935]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:36:07 np0005535656 systemd[1]: Reloading.
Nov 25 13:36:07 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:36:07 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:36:08 np0005535656 python3.9[105120]: ansible-ansible.builtin.service_facts Invoked
Nov 25 13:36:08 np0005535656 network[105137]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 13:36:08 np0005535656 network[105138]: 'network-scripts' will be removed from distribution in near future.
Nov 25 13:36:08 np0005535656 network[105139]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 13:36:14 np0005535656 python3.9[105400]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:36:15 np0005535656 python3.9[105553]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:36:15 np0005535656 python3.9[105706]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:36:16 np0005535656 python3.9[105859]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:36:17 np0005535656 python3.9[106012]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:36:19 np0005535656 python3.9[106165]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:36:20 np0005535656 python3.9[106318]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:36:21 np0005535656 python3.9[106471]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:22 np0005535656 python3.9[106623]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:22 np0005535656 python3.9[106775]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:23 np0005535656 python3.9[106927]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:24 np0005535656 podman[107051]: 2025-11-25 18:36:24.371481703 +0000 UTC m=+0.113613742 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 25 13:36:24 np0005535656 python3.9[107097]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:25 np0005535656 python3.9[107255]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:25 np0005535656 python3.9[107407]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:27 np0005535656 python3.9[107559]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:27 np0005535656 podman[107683]: 2025-11-25 18:36:27.631152082 +0000 UTC m=+0.077842878 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 25 13:36:28 np0005535656 python3.9[107730]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:28 np0005535656 python3.9[107882]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:29 np0005535656 python3.9[108034]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:30 np0005535656 python3.9[108186]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:30 np0005535656 python3.9[108338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:31 np0005535656 python3.9[108490]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:36:32 np0005535656 python3.9[108642]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:36:33 np0005535656 python3.9[108794]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 13:36:34 np0005535656 python3.9[108946]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:36:34 np0005535656 systemd[1]: Reloading.
Nov 25 13:36:34 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:36:34 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:36:35 np0005535656 python3.9[109135]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:36:36 np0005535656 python3.9[109288]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:36:37 np0005535656 python3.9[109441]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:36:37 np0005535656 python3.9[109594]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:36:38 np0005535656 python3.9[109747]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:36:39 np0005535656 python3.9[109900]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:36:40 np0005535656 python3.9[110053]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:36:41 np0005535656 python3.9[110206]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 25 13:36:42 np0005535656 python3.9[110359]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 13:36:44 np0005535656 python3.9[110517]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 13:36:45 np0005535656 python3.9[110677]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:36:46 np0005535656 python3.9[110761]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:36:55 np0005535656 podman[110786]: 2025-11-25 18:36:55.090390716 +0000 UTC m=+0.186258639 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Nov 25 13:36:57 np0005535656 podman[110905]: 2025-11-25 18:36:57.943130756 +0000 UTC m=+0.062440118 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:36:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:59.044 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:36:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:59.045 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:36:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:36:59.045 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:37:15 np0005535656 kernel: SELinux:  Converting 2758 SID table entries...
Nov 25 13:37:15 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:37:15 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:37:15 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:37:15 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:37:15 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:37:15 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:37:15 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:37:25 np0005535656 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 25 13:37:26 np0005535656 podman[111009]: 2025-11-25 18:37:26.057780147 +0000 UTC m=+0.136361490 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Nov 25 13:37:27 np0005535656 kernel: SELinux:  Converting 2758 SID table entries...
Nov 25 13:37:27 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:37:27 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:37:27 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:37:27 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:37:27 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:37:27 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:37:27 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:37:28 np0005535656 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 25 13:37:28 np0005535656 podman[111042]: 2025-11-25 18:37:28.972427011 +0000 UTC m=+0.082548278 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 25 13:37:57 np0005535656 podman[120074]: 2025-11-25 18:37:57.012057901 +0000 UTC m=+0.122131226 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 13:37:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:37:59.045 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:37:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:37:59.046 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:37:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:37:59.046 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:37:59 np0005535656 podman[121529]: 2025-11-25 18:37:59.963210494 +0000 UTC m=+0.072976717 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 25 13:38:26 np0005535656 kernel: SELinux:  Converting 2759 SID table entries...
Nov 25 13:38:26 np0005535656 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 13:38:26 np0005535656 kernel: SELinux:  policy capability open_perms=1
Nov 25 13:38:26 np0005535656 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 13:38:26 np0005535656 kernel: SELinux:  policy capability always_check_network=0
Nov 25 13:38:26 np0005535656 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 13:38:26 np0005535656 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 13:38:26 np0005535656 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 13:38:27 np0005535656 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 25 13:38:27 np0005535656 podman[127915]: 2025-11-25 18:38:27.457718398 +0000 UTC m=+0.114138014 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 13:38:27 np0005535656 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Nov 25 13:38:27 np0005535656 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Nov 25 13:38:30 np0005535656 podman[128001]: 2025-11-25 18:38:30.989447325 +0000 UTC m=+0.088107221 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 25 13:38:35 np0005535656 systemd[1]: Stopping OpenSSH server daemon...
Nov 25 13:38:35 np0005535656 systemd[1]: sshd.service: Deactivated successfully.
Nov 25 13:38:35 np0005535656 systemd[1]: Stopped OpenSSH server daemon.
Nov 25 13:38:35 np0005535656 systemd[1]: sshd.service: Consumed 2.117s CPU time, read 32.0K from disk, written 8.0K to disk.
Nov 25 13:38:35 np0005535656 systemd[1]: Stopped target sshd-keygen.target.
Nov 25 13:38:35 np0005535656 systemd[1]: Stopping sshd-keygen.target...
Nov 25 13:38:35 np0005535656 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 13:38:35 np0005535656 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 13:38:35 np0005535656 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 13:38:35 np0005535656 systemd[1]: Reached target sshd-keygen.target.
Nov 25 13:38:35 np0005535656 systemd[1]: Starting OpenSSH server daemon...
Nov 25 13:38:35 np0005535656 systemd[1]: Started OpenSSH server daemon.
Nov 25 13:38:38 np0005535656 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 13:38:38 np0005535656 systemd[1]: Starting man-db-cache-update.service...
Nov 25 13:38:38 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:38 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:38 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:38 np0005535656 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 13:38:42 np0005535656 python3.9[132393]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 13:38:42 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:42 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:42 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:43 np0005535656 python3.9[133672]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 13:38:44 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:44 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:44 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:45 np0005535656 python3.9[134732]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 13:38:45 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:45 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:45 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:46 np0005535656 python3.9[135760]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 13:38:46 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:46 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:46 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:49 np0005535656 python3.9[137988]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:38:49 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:49 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:49 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:49 np0005535656 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 13:38:49 np0005535656 systemd[1]: Finished man-db-cache-update.service.
Nov 25 13:38:49 np0005535656 systemd[1]: man-db-cache-update.service: Consumed 14.046s CPU time.
Nov 25 13:38:49 np0005535656 systemd[1]: run-r31c0eaf2dee94e4eb4dd27ef4e3b371f.service: Deactivated successfully.
Nov 25 13:38:50 np0005535656 python3.9[138458]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:38:50 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:50 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:50 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:51 np0005535656 python3.9[138648]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:38:51 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:51 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:51 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:52 np0005535656 python3.9[138838]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:38:54 np0005535656 python3.9[138993]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:38:54 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:54 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:54 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:55 np0005535656 python3.9[139184]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 13:38:56 np0005535656 systemd[1]: Reloading.
Nov 25 13:38:56 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:38:56 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:38:56 np0005535656 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 25 13:38:57 np0005535656 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 25 13:38:57 np0005535656 podman[139350]: 2025-11-25 18:38:57.751718055 +0000 UTC m=+0.116206130 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:38:58 np0005535656 python3.9[139398]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:38:58 np0005535656 python3.9[139559]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:38:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:38:59.047 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:38:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:38:59.047 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:38:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:38:59.048 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:39:00 np0005535656 python3.9[139714]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:01 np0005535656 python3.9[139869]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:01 np0005535656 podman[139871]: 2025-11-25 18:39:01.173149242 +0000 UTC m=+0.060483355 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 25 13:39:02 np0005535656 python3.9[140041]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:03 np0005535656 python3.9[140196]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:05 np0005535656 python3.9[140351]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:06 np0005535656 python3.9[140506]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:07 np0005535656 python3.9[140661]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:08 np0005535656 python3.9[140816]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:09 np0005535656 python3.9[140971]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:10 np0005535656 python3.9[141126]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:11 np0005535656 python3.9[141281]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:12 np0005535656 python3.9[141436]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 13:39:15 np0005535656 python3.9[141591]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:39:16 np0005535656 python3.9[141743]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:39:16 np0005535656 python3.9[141895]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:39:17 np0005535656 python3.9[142047]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:39:18 np0005535656 python3.9[142199]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:39:19 np0005535656 python3.9[142351]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:39:20 np0005535656 python3.9[142503]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:21 np0005535656 python3.9[142628]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764095959.6838937-1094-229466533357061/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:22 np0005535656 python3.9[142780]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:22 np0005535656 python3.9[142905]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764095961.538474-1094-110265733255839/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:23 np0005535656 python3.9[143057]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:24 np0005535656 python3.9[143182]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764095962.9612026-1094-175826413658782/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:25 np0005535656 python3.9[143334]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:25 np0005535656 python3.9[143459]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764095964.487162-1094-155933395173375/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:26 np0005535656 python3.9[143611]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:27 np0005535656 python3.9[143736]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764095965.8652058-1094-135403315817246/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:27 np0005535656 python3.9[143888]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:28 np0005535656 podman[143896]: 2025-11-25 18:39:28.02681031 +0000 UTC m=+0.133824575 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:39:28 np0005535656 python3.9[144039]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764095967.2467632-1094-89556656388649/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:29 np0005535656 python3.9[144191]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:30 np0005535656 python3.9[144314]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764095968.6648245-1094-101929518702564/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:30 np0005535656 python3.9[144466]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:31 np0005535656 podman[144563]: 2025-11-25 18:39:31.360734048 +0000 UTC m=+0.071758474 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 25 13:39:31 np0005535656 python3.9[144611]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764095970.2153192-1094-5772731323118/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:32 np0005535656 python3.9[144764]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 25 13:39:33 np0005535656 python3.9[144917]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:34 np0005535656 python3.9[145069]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:35 np0005535656 python3.9[145221]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:35 np0005535656 python3.9[145373]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:36 np0005535656 python3.9[145525]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:37 np0005535656 python3.9[145677]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:38 np0005535656 python3.9[145829]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:38 np0005535656 python3.9[145981]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:39 np0005535656 python3.9[146133]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:40 np0005535656 python3.9[146285]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:41 np0005535656 python3.9[146437]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:41 np0005535656 python3.9[146589]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:42 np0005535656 python3.9[146741]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:43 np0005535656 python3.9[146893]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:45 np0005535656 python3.9[147045]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:45 np0005535656 python3.9[147168]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095984.6210048-1536-258799209532418/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:46 np0005535656 python3.9[147320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:47 np0005535656 python3.9[147443]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095986.238689-1536-145119382013907/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:48 np0005535656 python3.9[147597]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:48 np0005535656 python3.9[147720]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095987.7153306-1536-261891415565407/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:49 np0005535656 python3.9[147872]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:50 np0005535656 python3.9[147995]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095989.1330614-1536-103801991328407/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:51 np0005535656 python3.9[148147]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:51 np0005535656 python3.9[148270]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095990.6260512-1536-238516624870680/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:52 np0005535656 python3.9[148422]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:53 np0005535656 python3.9[148545]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095992.1546838-1536-54051138869980/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:54 np0005535656 python3.9[148697]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:55 np0005535656 python3.9[148820]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095993.7728004-1536-44732101066503/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:55 np0005535656 python3.9[148972]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:56 np0005535656 python3.9[149095]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095995.4468222-1536-278617334397924/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:57 np0005535656 python3.9[149247]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:39:58 np0005535656 podman[149342]: 2025-11-25 18:39:58.362259885 +0000 UTC m=+0.088091017 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, container_name=ovn_controller)
Nov 25 13:39:58 np0005535656 python3.9[149390]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095996.9862761-1536-257096294653490/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:39:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:39:59.048 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:39:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:39:59.049 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:39:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:39:59.049 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:39:59 np0005535656 python3.9[149549]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:00 np0005535656 python3.9[149672]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095998.7569206-1536-223807284724998/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:00 np0005535656 python3.9[149824]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:01 np0005535656 podman[149919]: 2025-11-25 18:40:01.644506512 +0000 UTC m=+0.057968568 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:40:01 np0005535656 python3.9[149967]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096000.2331855-1536-264396080706685/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:02 np0005535656 python3.9[150119]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:03 np0005535656 python3.9[150242]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096002.0397608-1536-30733711055529/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:04 np0005535656 python3.9[150394]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:05 np0005535656 python3.9[150517]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096003.8472955-1536-194159263848097/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:06 np0005535656 python3.9[150669]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:07 np0005535656 python3.9[150792]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096005.6465833-1536-36639700054896/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:11 np0005535656 python3.9[150942]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:40:12 np0005535656 python3.9[151097]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 25 13:40:17 np0005535656 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 25 13:40:17 np0005535656 python3.9[151253]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:18 np0005535656 python3.9[151405]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:18 np0005535656 python3.9[151557]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:19 np0005535656 python3.9[151709]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:20 np0005535656 python3.9[151861]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:21 np0005535656 python3.9[152013]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:22 np0005535656 python3.9[152165]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:23 np0005535656 python3.9[152317]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:24 np0005535656 python3.9[152469]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:24 np0005535656 python3.9[152621]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:26 np0005535656 python3.9[152773]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:40:26 np0005535656 systemd[1]: Reloading.
Nov 25 13:40:26 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:40:26 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:40:26 np0005535656 systemd[1]: Starting libvirt logging daemon socket...
Nov 25 13:40:26 np0005535656 systemd[1]: Listening on libvirt logging daemon socket.
Nov 25 13:40:26 np0005535656 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 25 13:40:26 np0005535656 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 25 13:40:26 np0005535656 systemd[1]: Starting libvirt logging daemon...
Nov 25 13:40:26 np0005535656 systemd[1]: Started libvirt logging daemon.
Nov 25 13:40:27 np0005535656 python3.9[152966]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:40:27 np0005535656 systemd[1]: Reloading.
Nov 25 13:40:27 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:40:27 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:40:28 np0005535656 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 25 13:40:28 np0005535656 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 25 13:40:28 np0005535656 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 25 13:40:28 np0005535656 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 25 13:40:28 np0005535656 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 25 13:40:28 np0005535656 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 25 13:40:28 np0005535656 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 25 13:40:28 np0005535656 systemd[1]: Starting libvirt nodedev daemon...
Nov 25 13:40:28 np0005535656 systemd[1]: Started libvirt nodedev daemon.
Nov 25 13:40:28 np0005535656 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 25 13:40:28 np0005535656 podman[153100]: 2025-11-25 18:40:28.521690552 +0000 UTC m=+0.119965432 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 13:40:28 np0005535656 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 25 13:40:28 np0005535656 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 25 13:40:28 np0005535656 python3.9[153217]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:40:29 np0005535656 systemd[1]: Reloading.
Nov 25 13:40:29 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:40:29 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:40:29 np0005535656 setroubleshoot[153004]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l f8c8c3ee-1124-4a4f-a605-baa9c0bf76e5
Nov 25 13:40:29 np0005535656 setroubleshoot[153004]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 25 13:40:29 np0005535656 setroubleshoot[153004]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l f8c8c3ee-1124-4a4f-a605-baa9c0bf76e5
Nov 25 13:40:29 np0005535656 setroubleshoot[153004]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 25 13:40:30 np0005535656 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 25 13:40:30 np0005535656 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 25 13:40:30 np0005535656 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 25 13:40:30 np0005535656 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 25 13:40:30 np0005535656 systemd[1]: Starting libvirt proxy daemon...
Nov 25 13:40:30 np0005535656 systemd[1]: Started libvirt proxy daemon.
Nov 25 13:40:31 np0005535656 python3.9[153430]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:40:31 np0005535656 systemd[1]: Reloading.
Nov 25 13:40:31 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:40:31 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:40:31 np0005535656 systemd[1]: Listening on libvirt locking daemon socket.
Nov 25 13:40:31 np0005535656 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 25 13:40:31 np0005535656 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 25 13:40:31 np0005535656 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 25 13:40:31 np0005535656 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 25 13:40:31 np0005535656 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 25 13:40:31 np0005535656 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 25 13:40:31 np0005535656 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 25 13:40:31 np0005535656 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 25 13:40:31 np0005535656 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 25 13:40:31 np0005535656 systemd[1]: Starting libvirt QEMU daemon...
Nov 25 13:40:31 np0005535656 podman[153468]: 2025-11-25 18:40:31.842267875 +0000 UTC m=+0.059260572 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 13:40:31 np0005535656 systemd[1]: Started libvirt QEMU daemon.
Nov 25 13:40:32 np0005535656 python3.9[153666]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:40:32 np0005535656 systemd[1]: Reloading.
Nov 25 13:40:32 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:40:32 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:40:32 np0005535656 systemd[1]: Starting libvirt secret daemon socket...
Nov 25 13:40:32 np0005535656 systemd[1]: Listening on libvirt secret daemon socket.
Nov 25 13:40:32 np0005535656 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 25 13:40:33 np0005535656 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 25 13:40:33 np0005535656 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 25 13:40:33 np0005535656 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 25 13:40:33 np0005535656 systemd[1]: Starting libvirt secret daemon...
Nov 25 13:40:33 np0005535656 systemd[1]: Started libvirt secret daemon.
Nov 25 13:40:34 np0005535656 python3.9[153878]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:35 np0005535656 python3.9[154030]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 13:40:36 np0005535656 python3.9[154182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:37 np0005535656 python3.9[154305]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096035.9011378-2227-115904308983027/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:38 np0005535656 python3.9[154458]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:39 np0005535656 python3.9[154610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:39 np0005535656 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 25 13:40:39 np0005535656 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 25 13:40:40 np0005535656 python3.9[154688]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:40 np0005535656 python3.9[154840]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:41 np0005535656 python3.9[154918]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.yweq3_7f recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:42 np0005535656 python3.9[155070]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:43 np0005535656 python3.9[155148]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:44 np0005535656 python3.9[155300]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:40:45 np0005535656 python3[155453]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 13:40:46 np0005535656 python3.9[155605]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:46 np0005535656 python3.9[155683]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:47 np0005535656 python3.9[155835]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:48 np0005535656 python3.9[155913]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:48 np0005535656 python3.9[156065]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:49 np0005535656 python3.9[156143]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:50 np0005535656 python3.9[156295]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:51 np0005535656 python3.9[156373]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:52 np0005535656 python3.9[156525]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:40:52 np0005535656 python3.9[156650]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096051.5089624-2476-42138252835696/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:54 np0005535656 python3.9[156802]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:55 np0005535656 python3.9[156954]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:40:56 np0005535656 python3.9[157109]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:40:57 np0005535656 python3.9[157261]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:40:58 np0005535656 python3.9[157414]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:40:58 np0005535656 podman[157540]: 2025-11-25 18:40:58.871740508 +0000 UTC m=+0.145048746 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:40:58 np0005535656 python3.9[157582]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:40:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:40:59.049 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:40:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:40:59.050 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:40:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:40:59.050 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:41:00 np0005535656 python3.9[157750]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:01 np0005535656 python3.9[157902]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:41:01 np0005535656 python3.9[158025]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096060.4739032-2620-152068501336363/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:01 np0005535656 podman[158050]: 2025-11-25 18:41:01.933605026 +0000 UTC m=+0.056043736 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 25 13:41:02 np0005535656 python3.9[158198]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:41:03 np0005535656 python3.9[158321]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096062.18629-2651-118279796008187/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:04 np0005535656 python3.9[158473]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:41:05 np0005535656 python3.9[158596]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096063.9420948-2681-195000945483404/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:06 np0005535656 python3.9[158748]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:41:06 np0005535656 systemd[1]: Reloading.
Nov 25 13:41:06 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:41:06 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:41:06 np0005535656 systemd[1]: Reached target edpm_libvirt.target.
Nov 25 13:41:07 np0005535656 python3.9[158939]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 25 13:41:07 np0005535656 systemd[1]: Reloading.
Nov 25 13:41:08 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:41:08 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:41:08 np0005535656 systemd[1]: Reloading.
Nov 25 13:41:08 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:41:08 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:41:09 np0005535656 systemd[1]: session-24.scope: Deactivated successfully.
Nov 25 13:41:09 np0005535656 systemd[1]: session-24.scope: Consumed 3min 56.202s CPU time.
Nov 25 13:41:09 np0005535656 systemd-logind[788]: Session 24 logged out. Waiting for processes to exit.
Nov 25 13:41:09 np0005535656 systemd-logind[788]: Removed session 24.
Nov 25 13:41:14 np0005535656 systemd-logind[788]: New session 25 of user zuul.
Nov 25 13:41:14 np0005535656 systemd[1]: Started Session 25 of User zuul.
Nov 25 13:41:15 np0005535656 python3.9[159193]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:41:17 np0005535656 python3.9[159347]: ansible-ansible.builtin.service_facts Invoked
Nov 25 13:41:17 np0005535656 network[159364]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 13:41:17 np0005535656 network[159365]: 'network-scripts' will be removed from distribution in near future.
Nov 25 13:41:17 np0005535656 network[159366]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 13:41:22 np0005535656 python3.9[159637]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 13:41:24 np0005535656 python3.9[159721]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:41:29 np0005535656 podman[159846]: 2025-11-25 18:41:29.976389107 +0000 UTC m=+0.173601176 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 25 13:41:30 np0005535656 python3.9[159896]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:41:31 np0005535656 python3.9[160054]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:41:32 np0005535656 podman[160179]: 2025-11-25 18:41:32.164789231 +0000 UTC m=+0.089530573 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:41:32 np0005535656 python3.9[160220]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:41:33 np0005535656 python3.9[160376]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:41:33 np0005535656 python3.9[160529]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:41:34 np0005535656 python3.9[160652]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096093.4111974-175-112411185192323/.source.iscsi _original_basename=.gmu0jmgu follow=False checksum=7bd493efb00c6e1990885b1db162387d24bd6f5e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:35 np0005535656 python3.9[160804]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:36 np0005535656 python3.9[160956]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:36 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 13:41:36 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 13:41:36 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 13:41:38 np0005535656 python3.9[161109]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:41:38 np0005535656 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 25 13:41:39 np0005535656 python3.9[161265]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:41:39 np0005535656 systemd[1]: Reloading.
Nov 25 13:41:39 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:41:39 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:41:39 np0005535656 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 25 13:41:39 np0005535656 systemd[1]: Starting Open-iSCSI...
Nov 25 13:41:39 np0005535656 kernel: Loading iSCSI transport class v2.0-870.
Nov 25 13:41:39 np0005535656 systemd[1]: Started Open-iSCSI.
Nov 25 13:41:39 np0005535656 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 25 13:41:39 np0005535656 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 25 13:41:41 np0005535656 python3.9[161464]: ansible-ansible.builtin.service_facts Invoked
Nov 25 13:41:41 np0005535656 network[161481]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 13:41:41 np0005535656 network[161482]: 'network-scripts' will be removed from distribution in near future.
Nov 25 13:41:41 np0005535656 network[161483]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 13:41:48 np0005535656 python3.9[161754]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 25 13:41:49 np0005535656 python3.9[161906]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 25 13:41:50 np0005535656 python3.9[162062]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:41:50 np0005535656 python3.9[162185]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096109.5952725-329-1722180797351/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:51 np0005535656 python3.9[162337]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:53 np0005535656 python3.9[162489]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:41:53 np0005535656 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 25 13:41:53 np0005535656 systemd[1]: Stopped Load Kernel Modules.
Nov 25 13:41:53 np0005535656 systemd[1]: Stopping Load Kernel Modules...
Nov 25 13:41:53 np0005535656 systemd[1]: Starting Load Kernel Modules...
Nov 25 13:41:53 np0005535656 systemd[1]: Finished Load Kernel Modules.
Nov 25 13:41:54 np0005535656 python3.9[162645]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:41:55 np0005535656 python3.9[162797]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:41:55 np0005535656 python3.9[162949]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:41:56 np0005535656 python3.9[163101]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:41:57 np0005535656 python3.9[163224]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096116.2533324-445-68097536413008/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:41:58 np0005535656 python3.9[163376]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:41:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:41:59.050 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 13:41:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:41:59.052 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 13:41:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:41:59.052 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 13:41:59 np0005535656 python3.9[163529]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:00 np0005535656 podman[163681]: 2025-11-25 18:42:00.284810402 +0000 UTC m=+0.195405158 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller)
Nov 25 13:42:00 np0005535656 python3.9[163682]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:01 np0005535656 python3.9[163858]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:02 np0005535656 python3.9[164010]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:02 np0005535656 podman[164134]: 2025-11-25 18:42:02.803564694 +0000 UTC m=+0.085435127 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 13:42:03 np0005535656 python3.9[164181]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:03 np0005535656 python3.9[164333]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:04 np0005535656 python3.9[164485]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:05 np0005535656 python3.9[164637]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:42:06 np0005535656 python3.9[164791]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:07 np0005535656 python3.9[164943]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:42:08 np0005535656 python3.9[165095]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:42:08 np0005535656 python3.9[165173]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:42:09 np0005535656 python3.9[165325]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:42:09 np0005535656 python3.9[165403]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:42:10 np0005535656 python3.9[165555]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:11 np0005535656 python3.9[165707]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:42:11 np0005535656 python3.9[165785]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:12 np0005535656 python3.9[165937]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:42:13 np0005535656 python3.9[166015]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:14 np0005535656 python3.9[166167]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:42:14 np0005535656 systemd[1]: Reloading.
Nov 25 13:42:14 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:42:14 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:42:15 np0005535656 python3.9[166357]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:42:16 np0005535656 python3.9[166435]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:17 np0005535656 python3.9[166587]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:42:17 np0005535656 python3.9[166665]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:18 np0005535656 python3.9[166817]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:42:18 np0005535656 systemd[1]: Reloading.
Nov 25 13:42:18 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:42:18 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:42:19 np0005535656 systemd[1]: Starting Create netns directory...
Nov 25 13:42:19 np0005535656 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 13:42:19 np0005535656 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 13:42:19 np0005535656 systemd[1]: Finished Create netns directory.
Nov 25 13:42:20 np0005535656 python3.9[167011]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:42:20 np0005535656 python3.9[167163]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:42:21 np0005535656 python3.9[167286]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096140.3549929-859-246545633901408/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:42:22 np0005535656 python3.9[167438]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:42:23 np0005535656 python3.9[167590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:42:24 np0005535656 python3.9[167713]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096143.1255016-909-260307941170155/.source.json _original_basename=.4tu7igxm follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:25 np0005535656 python3.9[167865]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:28 np0005535656 python3.9[168292]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 25 13:42:28 np0005535656 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 25 13:42:29 np0005535656 python3.9[168445]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 13:42:30 np0005535656 python3.9[168597]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 25 13:42:30 np0005535656 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 25 13:42:30 np0005535656 podman[168612]: 2025-11-25 18:42:30.68375385 +0000 UTC m=+0.153241475 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 25 13:42:31 np0005535656 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 25 13:42:32 np0005535656 python3[168803]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 13:42:32 np0005535656 podman[168841]: 2025-11-25 18:42:32.670317297 +0000 UTC m=+0.092546380 container create 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 25 13:42:32 np0005535656 podman[168841]: 2025-11-25 18:42:32.60183366 +0000 UTC m=+0.024062743 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 25 13:42:32 np0005535656 python3[168803]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 25 13:42:32 np0005535656 podman[168887]: 2025-11-25 18:42:32.98586798 +0000 UTC m=+0.101525653 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 25 13:42:33 np0005535656 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 25 13:42:33 np0005535656 python3.9[169053]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:42:34 np0005535656 python3.9[169207]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:35 np0005535656 python3.9[169283]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:42:36 np0005535656 python3.9[169434]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764096155.4558425-1085-70192615534718/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:36 np0005535656 python3.9[169510]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:42:36 np0005535656 systemd[1]: Reloading.
Nov 25 13:42:37 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:42:37 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:42:37 np0005535656 python3.9[169621]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:42:37 np0005535656 systemd[1]: Reloading.
Nov 25 13:42:38 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:42:38 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:42:38 np0005535656 systemd[1]: Starting multipathd container...
Nov 25 13:42:38 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:42:38 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92d28838bc303e8694eb4c1df6703a68a90a1524a1873177b3395862b90b20c8/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 13:42:38 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92d28838bc303e8694eb4c1df6703a68a90a1524a1873177b3395862b90b20c8/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 13:42:38 np0005535656 systemd[1]: Started /usr/bin/podman healthcheck run 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60.
Nov 25 13:42:38 np0005535656 podman[169662]: 2025-11-25 18:42:38.457024186 +0000 UTC m=+0.159130135 container init 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:42:38 np0005535656 multipathd[169678]: + sudo -E kolla_set_configs
Nov 25 13:42:38 np0005535656 podman[169662]: 2025-11-25 18:42:38.499228489 +0000 UTC m=+0.201334418 container start 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:42:38 np0005535656 podman[169662]: multipathd
Nov 25 13:42:38 np0005535656 systemd[1]: Started multipathd container.
Nov 25 13:42:38 np0005535656 multipathd[169678]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 13:42:38 np0005535656 multipathd[169678]: INFO:__main__:Validating config file
Nov 25 13:42:38 np0005535656 multipathd[169678]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 13:42:38 np0005535656 multipathd[169678]: INFO:__main__:Writing out command to execute
Nov 25 13:42:38 np0005535656 podman[169685]: 2025-11-25 18:42:38.582045994 +0000 UTC m=+0.066664397 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 25 13:42:38 np0005535656 multipathd[169678]: ++ cat /run_command
Nov 25 13:42:38 np0005535656 multipathd[169678]: + CMD='/usr/sbin/multipathd -d'
Nov 25 13:42:38 np0005535656 multipathd[169678]: + ARGS=
Nov 25 13:42:38 np0005535656 multipathd[169678]: + sudo kolla_copy_cacerts
Nov 25 13:42:38 np0005535656 systemd[1]: 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60-4f27c0caf3d99411.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 13:42:38 np0005535656 systemd[1]: 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60-4f27c0caf3d99411.service: Failed with result 'exit-code'.
Nov 25 13:42:38 np0005535656 multipathd[169678]: + [[ ! -n '' ]]
Nov 25 13:42:38 np0005535656 multipathd[169678]: + . kolla_extend_start
Nov 25 13:42:38 np0005535656 multipathd[169678]: Running command: '/usr/sbin/multipathd -d'
Nov 25 13:42:38 np0005535656 multipathd[169678]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 25 13:42:38 np0005535656 multipathd[169678]: + umask 0022
Nov 25 13:42:38 np0005535656 multipathd[169678]: + exec /usr/sbin/multipathd -d
Nov 25 13:42:38 np0005535656 multipathd[169678]: 3120.203636 | --------start up--------
Nov 25 13:42:38 np0005535656 multipathd[169678]: 3120.203657 | read /etc/multipath.conf
Nov 25 13:42:38 np0005535656 multipathd[169678]: 3120.211392 | path checkers start up
Nov 25 13:42:39 np0005535656 python3.9[169867]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:42:40 np0005535656 python3.9[170021]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:42:41 np0005535656 python3.9[170188]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:42:41 np0005535656 systemd[1]: Stopping multipathd container...
Nov 25 13:42:41 np0005535656 multipathd[169678]: 3123.339750 | exit (signal)
Nov 25 13:42:41 np0005535656 multipathd[169678]: 3123.341234 | --------shut down-------
Nov 25 13:42:41 np0005535656 systemd[1]: libpod-1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60.scope: Deactivated successfully.
Nov 25 13:42:41 np0005535656 podman[170192]: 2025-11-25 18:42:41.790210952 +0000 UTC m=+0.083920855 container died 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 25 13:42:41 np0005535656 systemd[1]: 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60-4f27c0caf3d99411.timer: Deactivated successfully.
Nov 25 13:42:41 np0005535656 systemd[1]: Stopped /usr/bin/podman healthcheck run 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60.
Nov 25 13:42:41 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60-userdata-shm.mount: Deactivated successfully.
Nov 25 13:42:41 np0005535656 systemd[1]: var-lib-containers-storage-overlay-92d28838bc303e8694eb4c1df6703a68a90a1524a1873177b3395862b90b20c8-merged.mount: Deactivated successfully.
Nov 25 13:42:41 np0005535656 podman[170192]: 2025-11-25 18:42:41.85467585 +0000 UTC m=+0.148385713 container cleanup 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 13:42:41 np0005535656 podman[170192]: multipathd
Nov 25 13:42:41 np0005535656 podman[170220]: multipathd
Nov 25 13:42:41 np0005535656 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 25 13:42:41 np0005535656 systemd[1]: Stopped multipathd container.
Nov 25 13:42:41 np0005535656 systemd[1]: Starting multipathd container...
Nov 25 13:42:42 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:42:42 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92d28838bc303e8694eb4c1df6703a68a90a1524a1873177b3395862b90b20c8/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 13:42:42 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92d28838bc303e8694eb4c1df6703a68a90a1524a1873177b3395862b90b20c8/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 13:42:42 np0005535656 systemd[1]: Started /usr/bin/podman healthcheck run 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60.
Nov 25 13:42:42 np0005535656 podman[170233]: 2025-11-25 18:42:42.120745591 +0000 UTC m=+0.128920315 container init 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 25 13:42:42 np0005535656 multipathd[170248]: + sudo -E kolla_set_configs
Nov 25 13:42:42 np0005535656 podman[170233]: 2025-11-25 18:42:42.151583567 +0000 UTC m=+0.159758301 container start 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 25 13:42:42 np0005535656 podman[170233]: multipathd
Nov 25 13:42:42 np0005535656 systemd[1]: Started multipathd container.
Nov 25 13:42:42 np0005535656 multipathd[170248]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 13:42:42 np0005535656 multipathd[170248]: INFO:__main__:Validating config file
Nov 25 13:42:42 np0005535656 multipathd[170248]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 13:42:42 np0005535656 multipathd[170248]: INFO:__main__:Writing out command to execute
Nov 25 13:42:42 np0005535656 multipathd[170248]: ++ cat /run_command
Nov 25 13:42:42 np0005535656 multipathd[170248]: + CMD='/usr/sbin/multipathd -d'
Nov 25 13:42:42 np0005535656 multipathd[170248]: + ARGS=
Nov 25 13:42:42 np0005535656 multipathd[170248]: + sudo kolla_copy_cacerts
Nov 25 13:42:42 np0005535656 podman[170255]: 2025-11-25 18:42:42.260105498 +0000 UTC m=+0.091067799 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 13:42:42 np0005535656 systemd[1]: 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60-4867f0d091eba788.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 13:42:42 np0005535656 systemd[1]: 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60-4867f0d091eba788.service: Failed with result 'exit-code'.
Nov 25 13:42:42 np0005535656 multipathd[170248]: Running command: '/usr/sbin/multipathd -d'
Nov 25 13:42:42 np0005535656 multipathd[170248]: + [[ ! -n '' ]]
Nov 25 13:42:42 np0005535656 multipathd[170248]: + . kolla_extend_start
Nov 25 13:42:42 np0005535656 multipathd[170248]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 25 13:42:42 np0005535656 multipathd[170248]: + umask 0022
Nov 25 13:42:42 np0005535656 multipathd[170248]: + exec /usr/sbin/multipathd -d
Nov 25 13:42:42 np0005535656 multipathd[170248]: 3123.865020 | --------start up--------
Nov 25 13:42:42 np0005535656 multipathd[170248]: 3123.865038 | read /etc/multipath.conf
Nov 25 13:42:42 np0005535656 multipathd[170248]: 3123.872537 | path checkers start up
Nov 25 13:42:43 np0005535656 python3.9[170439]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:44 np0005535656 python3.9[170591]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 25 13:42:45 np0005535656 python3.9[170743]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 25 13:42:45 np0005535656 kernel: Key type psk registered
Nov 25 13:42:46 np0005535656 python3.9[170906]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:42:47 np0005535656 python3.9[171029]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096165.6605124-1245-102976303632451/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:48 np0005535656 python3.9[171181]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:49 np0005535656 python3.9[171333]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:42:49 np0005535656 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 25 13:42:49 np0005535656 systemd[1]: Stopped Load Kernel Modules.
Nov 25 13:42:49 np0005535656 systemd[1]: Stopping Load Kernel Modules...
Nov 25 13:42:49 np0005535656 systemd[1]: Starting Load Kernel Modules...
Nov 25 13:42:49 np0005535656 systemd[1]: Finished Load Kernel Modules.
Nov 25 13:42:50 np0005535656 python3.9[171489]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 13:42:52 np0005535656 systemd[1]: Reloading.
Nov 25 13:42:52 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:42:52 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:42:53 np0005535656 systemd[1]: Reloading.
Nov 25 13:42:53 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:42:53 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:42:53 np0005535656 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 25 13:42:53 np0005535656 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 25 13:42:53 np0005535656 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 13:42:53 np0005535656 systemd[1]: Starting man-db-cache-update.service...
Nov 25 13:42:53 np0005535656 systemd[1]: Reloading.
Nov 25 13:42:54 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:42:54 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:42:54 np0005535656 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 13:42:55 np0005535656 python3.9[172939]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:42:55 np0005535656 systemd[1]: Stopping Open-iSCSI...
Nov 25 13:42:55 np0005535656 iscsid[161304]: iscsid shutting down.
Nov 25 13:42:55 np0005535656 systemd[1]: iscsid.service: Deactivated successfully.
Nov 25 13:42:55 np0005535656 systemd[1]: Stopped Open-iSCSI.
Nov 25 13:42:55 np0005535656 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 25 13:42:55 np0005535656 systemd[1]: Starting Open-iSCSI...
Nov 25 13:42:55 np0005535656 systemd[1]: Started Open-iSCSI.
Nov 25 13:42:56 np0005535656 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 13:42:56 np0005535656 systemd[1]: Finished man-db-cache-update.service.
Nov 25 13:42:56 np0005535656 systemd[1]: man-db-cache-update.service: Consumed 2.024s CPU time.
Nov 25 13:42:56 np0005535656 systemd[1]: run-r7cbbf5a86fe94dd3a363c06dd4777dd1.service: Deactivated successfully.
Nov 25 13:42:57 np0005535656 python3.9[173094]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:42:58 np0005535656 python3.9[173250]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:42:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:42:59.051 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:42:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:42:59.053 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:42:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:42:59.053 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:42:59 np0005535656 python3.9[173402]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:42:59 np0005535656 systemd[1]: Reloading.
Nov 25 13:42:59 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:42:59 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:43:00 np0005535656 python3.9[173587]: ansible-ansible.builtin.service_facts Invoked
Nov 25 13:43:00 np0005535656 network[173604]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 13:43:00 np0005535656 network[173605]: 'network-scripts' will be removed from distribution in near future.
Nov 25 13:43:00 np0005535656 network[173606]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 13:43:01 np0005535656 podman[173611]: 2025-11-25 18:43:01.041204168 +0000 UTC m=+0.177123238 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 25 13:43:03 np0005535656 podman[173700]: 2025-11-25 18:43:03.177491005 +0000 UTC m=+0.112312929 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 25 13:43:06 np0005535656 python3.9[173922]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:43:07 np0005535656 python3.9[174075]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:43:08 np0005535656 python3.9[174228]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:43:09 np0005535656 python3.9[174381]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:43:10 np0005535656 python3.9[174534]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:43:11 np0005535656 python3.9[174687]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:43:12 np0005535656 python3.9[174840]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:43:12 np0005535656 podman[174965]: 2025-11-25 18:43:12.919087453 +0000 UTC m=+0.092715152 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 13:43:13 np0005535656 python3.9[175013]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:43:18 np0005535656 python3.9[175166]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:18 np0005535656 python3.9[175318]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:19 np0005535656 python3.9[175470]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:20 np0005535656 python3.9[175622]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:21 np0005535656 python3.9[175774]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:21 np0005535656 python3.9[175926]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:22 np0005535656 python3.9[176078]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:23 np0005535656 python3.9[176230]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:23 np0005535656 python3.9[176382]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:24 np0005535656 python3.9[176534]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:25 np0005535656 python3.9[176686]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:26 np0005535656 python3.9[176838]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:26 np0005535656 python3.9[176990]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:27 np0005535656 python3.9[177142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:28 np0005535656 python3.9[177294]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:28 np0005535656 python3.9[177446]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:43:30 np0005535656 python3.9[177598]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:43:31 np0005535656 python3.9[177750]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 13:43:32 np0005535656 podman[177828]: 2025-11-25 18:43:32.043710185 +0000 UTC m=+0.150678394 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:43:32 np0005535656 python3.9[177928]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:43:32 np0005535656 systemd[1]: Reloading.
Nov 25 13:43:32 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:43:32 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:43:33 np0005535656 podman[178088]: 2025-11-25 18:43:33.40015709 +0000 UTC m=+0.067950133 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 13:43:33 np0005535656 python3.9[178135]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:43:34 np0005535656 python3.9[178288]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:43:35 np0005535656 python3.9[178441]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:43:36 np0005535656 python3.9[178594]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:43:36 np0005535656 python3.9[178747]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:43:38 np0005535656 python3.9[178900]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:43:39 np0005535656 python3.9[179053]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:43:40 np0005535656 python3.9[179206]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:43:42 np0005535656 python3.9[179359]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:42 np0005535656 python3.9[179511]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:43 np0005535656 podman[179635]: 2025-11-25 18:43:43.403377254 +0000 UTC m=+0.096979767 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 13:43:43 np0005535656 python3.9[179683]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:44 np0005535656 python3.9[179835]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:45 np0005535656 python3.9[179987]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:46 np0005535656 python3.9[180139]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:46 np0005535656 python3.9[180291]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:47 np0005535656 python3.9[180443]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:48 np0005535656 python3.9[180595]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:49 np0005535656 python3.9[180747]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:43:54 np0005535656 python3.9[180899]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 25 13:43:55 np0005535656 python3.9[181052]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 13:43:56 np0005535656 python3.9[181210]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 13:43:58 np0005535656 systemd-logind[788]: New session 26 of user zuul.
Nov 25 13:43:58 np0005535656 systemd[1]: Started Session 26 of User zuul.
Nov 25 13:43:58 np0005535656 systemd[1]: session-26.scope: Deactivated successfully.
Nov 25 13:43:58 np0005535656 systemd-logind[788]: Session 26 logged out. Waiting for processes to exit.
Nov 25 13:43:58 np0005535656 systemd-logind[788]: Removed session 26.
Nov 25 13:43:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:43:59.052 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 13:43:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:43:59.053 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 13:43:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:43:59.053 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" released by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 13:43:59 np0005535656 python3.9[181396]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:43:59 np0005535656 python3.9[181517]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096238.5164242-2326-185110153452649/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:44:00 np0005535656 python3.9[181667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:44:00 np0005535656 python3.9[181743]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:44:01 np0005535656 python3.9[181893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:44:02 np0005535656 podman[181988]: 2025-11-25 18:44:02.261203318 +0000 UTC m=+0.134169067 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 25 13:44:02 np0005535656 python3.9[182028]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096241.1718695-2326-115852401668485/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:44:03 np0005535656 python3.9[182188]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:44:03 np0005535656 podman[182283]: 2025-11-25 18:44:03.698845511 +0000 UTC m=+0.069239713 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 13:44:03 np0005535656 python3.9[182324]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096242.6323018-2326-140982695856091/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:44:04 np0005535656 python3.9[182478]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:44:05 np0005535656 python3.9[182601]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096244.0905287-2326-260896605302637/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:44:06 np0005535656 python3.9[182751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:44:07 np0005535656 python3.9[182872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096245.8377812-2326-124500538381922/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:44:08 np0005535656 python3.9[183024]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:44:09 np0005535656 python3.9[183176]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:44:10 np0005535656 python3.9[183328]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:44:11 np0005535656 python3.9[183480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:44:12 np0005535656 python3.9[183603]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764096250.7510602-2540-250882910538145/.source _original_basename=.n8yioqx_ follow=False checksum=3c33f162ce10277f6ebfe976c24ece54a2836f5d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 25 13:44:12 np0005535656 python3.9[183755]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:44:13 np0005535656 podman[183881]: 2025-11-25 18:44:13.798808501 +0000 UTC m=+0.078390483 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd)
Nov 25 13:44:13 np0005535656 python3.9[183925]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:44:14 np0005535656 python3.9[184049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096253.3879883-2592-223598328815213/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:44:15 np0005535656 python3.9[184199]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:44:16 np0005535656 python3.9[184320]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096254.8846142-2623-155305832510947/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:44:17 np0005535656 python3.9[184472]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 25 13:44:20 np0005535656 python3.9[184624]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 13:44:21 np0005535656 python3[184776]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 13:44:21 np0005535656 podman[184815]: 2025-11-25 18:44:21.492674925 +0000 UTC m=+0.070084496 container create 0bbd074b8812f8ef27c2fcf1e829e82bbfee69967e9e68ec2cef906e0d9546d4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251118, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 13:44:21 np0005535656 podman[184815]: 2025-11-25 18:44:21.453129825 +0000 UTC m=+0.030539666 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 25 13:44:21 np0005535656 python3[184776]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 25 13:44:22 np0005535656 python3.9[185004]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:44:23 np0005535656 python3.9[185158]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 25 13:44:24 np0005535656 python3.9[185310]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 13:44:25 np0005535656 python3[185462]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 13:44:26 np0005535656 podman[185498]: 2025-11-25 18:44:26.01579087 +0000 UTC m=+0.071460063 container create e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:44:26 np0005535656 podman[185498]: 2025-11-25 18:44:25.974400869 +0000 UTC m=+0.030070082 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 25 13:44:26 np0005535656 python3[185462]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 25 13:44:27 np0005535656 python3.9[185688]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:44:28 np0005535656 python3.9[185842]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:44:28 np0005535656 python3.9[185993]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764096268.128711-2806-173983089557675/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:44:29 np0005535656 python3.9[186069]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:44:29 np0005535656 systemd[1]: Reloading.
Nov 25 13:44:29 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:44:29 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:44:30 np0005535656 python3.9[186179]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:44:30 np0005535656 systemd[1]: Reloading.
Nov 25 13:44:30 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:44:30 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:44:30 np0005535656 systemd[1]: Starting nova_compute container...
Nov 25 13:44:30 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:44:30 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:30 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:30 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:30 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:30 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:31 np0005535656 podman[186218]: 2025-11-25 18:44:31.018136829 +0000 UTC m=+0.134566719 container init e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, container_name=nova_compute, io.buildah.version=1.41.3)
Nov 25 13:44:31 np0005535656 podman[186218]: 2025-11-25 18:44:31.024615565 +0000 UTC m=+0.141045435 container start e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251118)
Nov 25 13:44:31 np0005535656 podman[186218]: nova_compute
Nov 25 13:44:31 np0005535656 systemd[1]: Started nova_compute container.
Nov 25 13:44:31 np0005535656 nova_compute[186233]: + sudo -E kolla_set_configs
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Validating config file
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Copying service configuration files
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Deleting /etc/ceph
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Creating directory /etc/ceph
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /etc/ceph
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Writing out command to execute
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 13:44:31 np0005535656 nova_compute[186233]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 13:44:31 np0005535656 nova_compute[186233]: ++ cat /run_command
Nov 25 13:44:31 np0005535656 nova_compute[186233]: + CMD=nova-compute
Nov 25 13:44:31 np0005535656 nova_compute[186233]: + ARGS=
Nov 25 13:44:31 np0005535656 nova_compute[186233]: + sudo kolla_copy_cacerts
Nov 25 13:44:31 np0005535656 nova_compute[186233]: + [[ ! -n '' ]]
Nov 25 13:44:31 np0005535656 nova_compute[186233]: + . kolla_extend_start
Nov 25 13:44:31 np0005535656 nova_compute[186233]: + echo 'Running command: '\''nova-compute'\'''
Nov 25 13:44:31 np0005535656 nova_compute[186233]: Running command: 'nova-compute'
Nov 25 13:44:31 np0005535656 nova_compute[186233]: + umask 0022
Nov 25 13:44:31 np0005535656 nova_compute[186233]: + exec nova-compute
Nov 25 13:44:32 np0005535656 python3.9[186395]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:44:33 np0005535656 podman[186471]: 2025-11-25 18:44:33.098292228 +0000 UTC m=+0.214128812 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 13:44:33 np0005535656 nova_compute[186233]: 2025-11-25 18:44:33.136 186237 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 25 13:44:33 np0005535656 nova_compute[186233]: 2025-11-25 18:44:33.136 186237 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 25 13:44:33 np0005535656 nova_compute[186233]: 2025-11-25 18:44:33.136 186237 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 25 13:44:33 np0005535656 nova_compute[186233]: 2025-11-25 18:44:33.136 186237 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 25 13:44:33 np0005535656 nova_compute[186233]: 2025-11-25 18:44:33.315 186237 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:44:33 np0005535656 nova_compute[186233]: 2025-11-25 18:44:33.343 186237 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:44:33 np0005535656 nova_compute[186233]: 2025-11-25 18:44:33.344 186237 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 25 13:44:33 np0005535656 python3.9[186573]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:44:33 np0005535656 nova_compute[186233]: 2025-11-25 18:44:33.861 186237 INFO nova.virt.driver [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 25 13:44:33 np0005535656 podman[186675]: 2025-11-25 18:44:33.932547124 +0000 UTC m=+0.056014691 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:44:33 np0005535656 nova_compute[186233]: 2025-11-25 18:44:33.988 186237 INFO nova.compute.provider_config [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.008 186237 DEBUG oslo_concurrency.lockutils [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.008 186237 DEBUG oslo_concurrency.lockutils [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.008 186237 DEBUG oslo_concurrency.lockutils [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.009 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.009 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.009 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.009 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.010 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.010 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.010 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.010 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.010 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.010 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.011 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.011 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.011 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.011 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.011 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.011 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.012 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.012 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.012 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.012 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.012 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.012 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.012 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.013 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.013 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.013 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.013 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.013 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.013 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.014 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.014 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.014 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.014 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.014 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.014 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.014 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.015 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.015 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.015 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.015 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.015 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.015 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.015 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.016 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.016 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.016 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.016 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.016 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.016 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.016 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.017 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.017 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.017 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.017 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.017 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.017 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.018 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.018 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.018 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.018 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.018 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.019 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.019 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.019 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.019 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.019 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.019 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.020 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.020 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.020 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.020 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.020 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.021 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.021 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.021 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.021 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.021 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.022 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.022 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.022 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.022 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.022 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.022 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.023 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.023 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.023 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.023 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.023 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.023 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.023 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.024 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.024 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.024 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.024 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.024 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.024 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.024 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.025 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.025 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.025 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.025 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.025 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.025 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.025 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.026 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.026 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.026 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.026 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.026 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.026 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.026 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.027 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.027 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.027 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.027 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.027 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.027 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.027 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.027 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.028 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.028 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.028 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.028 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.028 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.028 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.029 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.029 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.029 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.029 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.029 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.029 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.029 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.030 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.030 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.030 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.030 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.030 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.030 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.031 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.031 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.031 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.031 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.031 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.031 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.031 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.032 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.032 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.032 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.032 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.032 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.032 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.032 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.033 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.033 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.033 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.033 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.033 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.033 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.034 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.034 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.034 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.034 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.034 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.034 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.034 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.035 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.035 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.035 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.035 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.035 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.035 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.035 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.035 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.036 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.036 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.036 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.036 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.036 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.037 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.037 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.037 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.037 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.037 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.038 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.038 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.038 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.038 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.038 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.038 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.038 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.039 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.039 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.039 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.039 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.039 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.039 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.040 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.040 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.040 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.040 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.040 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.040 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.041 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.041 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.041 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.041 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.041 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.041 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.041 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.042 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.042 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.042 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.042 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.042 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.042 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.042 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.043 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.043 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.043 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.043 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.043 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.043 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.043 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.044 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.044 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.044 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.044 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.044 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.044 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.044 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.045 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.045 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.045 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.045 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.045 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.045 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.045 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.046 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.046 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.046 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.046 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.046 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.046 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.046 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.047 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.047 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.047 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.047 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.047 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.047 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.047 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.048 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.048 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.048 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.048 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.048 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.048 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.048 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.049 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.049 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.049 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.049 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.049 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.049 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.049 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.050 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.050 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.050 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.050 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.050 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.050 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.050 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.051 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.051 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.051 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.051 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.051 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.051 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.051 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.052 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.052 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.052 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.052 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.052 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.052 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.052 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.053 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.053 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.053 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.053 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.053 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.053 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.053 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.054 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.054 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.054 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.054 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.054 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.054 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.054 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.055 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.055 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.055 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.055 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.055 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.055 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.056 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.056 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.056 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.056 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.056 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.056 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.057 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.057 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.057 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.057 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.057 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.057 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.057 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.057 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.058 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.058 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.058 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.058 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.058 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.058 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.058 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.059 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.059 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.059 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.059 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.059 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.059 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.059 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.060 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.060 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.060 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.060 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.060 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.060 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.060 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.061 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.061 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.061 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.061 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.061 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.061 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.061 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.062 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.062 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.062 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.062 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.062 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.062 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.062 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.063 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.063 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.063 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.063 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.063 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.063 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.064 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.064 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.064 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.064 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.064 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.064 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.064 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.065 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.065 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.065 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.065 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.065 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.065 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.065 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.066 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.066 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.066 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.066 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.066 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.066 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.066 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.067 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.067 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.067 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.067 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.067 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.067 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.067 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.068 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.068 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.068 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.068 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.068 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.068 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.068 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.069 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.069 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.069 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.069 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.069 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.069 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.069 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.070 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.070 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.070 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.070 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.070 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.070 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.070 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.071 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.071 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.071 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.071 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.071 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.071 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.071 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.071 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.072 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.072 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.072 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.072 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.072 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.072 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.072 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.073 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.073 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.073 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.073 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.073 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.073 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.074 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.074 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.074 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.074 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.074 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.074 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.074 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.075 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.075 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.075 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.075 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.075 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.075 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.075 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.075 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.076 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.076 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.076 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.076 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.076 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.076 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.077 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.077 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.077 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.077 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.077 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.077 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.077 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.078 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.078 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.078 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.078 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.078 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.078 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.078 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.079 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.079 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.079 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.079 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.079 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.079 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.079 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.080 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.080 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.080 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.080 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.080 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.080 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.080 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.081 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.081 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.081 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.081 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.081 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.081 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.081 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.082 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.082 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.082 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.082 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.082 186237 WARNING oslo_config.cfg [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 25 13:44:34 np0005535656 nova_compute[186233]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 25 13:44:34 np0005535656 nova_compute[186233]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 25 13:44:34 np0005535656 nova_compute[186233]: and ``live_migration_inbound_addr`` respectively.
Nov 25 13:44:34 np0005535656 nova_compute[186233]: ).  Its value may be silently ignored in the future.#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.082 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.083 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.083 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.083 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.083 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.083 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.083 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.084 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.084 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.084 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.084 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.084 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.084 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.084 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.085 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.085 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.085 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.085 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.085 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.085 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.085 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.086 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.086 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.086 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.086 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.086 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.086 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.086 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.087 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.087 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.087 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.087 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.087 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.087 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.088 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.088 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.088 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.088 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.088 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.088 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.089 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.089 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.089 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.089 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.089 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.089 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.090 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.090 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.090 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.090 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.090 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.090 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.090 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.091 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.091 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.091 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.091 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.091 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.091 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.091 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.092 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.092 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.092 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.092 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.092 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.092 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.092 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.093 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.093 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.093 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.093 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.093 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.093 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.094 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.094 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.094 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.094 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.094 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.094 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.095 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.095 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.095 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.095 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.095 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.095 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.096 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.096 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.096 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.096 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.096 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.096 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.096 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.097 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.097 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.097 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.097 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.097 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.097 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.097 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.098 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.098 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.098 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.098 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.098 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.098 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.098 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.099 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.099 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.099 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.099 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.099 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.099 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.099 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.100 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.100 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.100 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.100 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.100 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.100 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.100 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.101 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.101 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.101 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.101 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.101 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.101 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.101 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.102 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.102 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.102 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.102 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.102 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.102 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.102 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.103 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.103 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.103 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.103 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.103 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.103 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.104 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.104 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.104 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.104 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.104 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.104 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.105 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.105 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.105 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.105 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.105 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.106 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.106 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.106 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.106 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.106 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.107 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.107 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.107 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.107 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.107 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.108 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.108 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.108 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.108 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.108 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.109 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.109 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.109 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.109 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.109 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.110 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.110 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.110 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.110 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.110 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.110 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.110 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.111 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.111 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.111 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.111 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.112 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.112 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.112 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.112 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.112 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.113 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.113 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.113 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.113 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.113 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.113 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.114 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.114 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.114 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.114 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.114 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.114 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.115 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.115 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.115 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.115 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.115 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.115 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.116 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.116 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.116 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.116 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.116 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.116 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.117 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.117 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.117 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.117 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.117 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.117 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.118 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.118 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.118 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.118 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.118 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.118 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.119 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.119 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.119 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.119 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.119 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.119 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.119 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.120 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.120 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.120 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.120 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.120 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.120 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.120 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.120 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.121 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.121 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.121 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.121 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.121 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.121 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.122 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.122 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.122 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.122 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.122 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.122 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.123 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.123 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.123 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.123 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.123 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.123 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.123 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.124 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.124 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.124 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.124 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.124 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.124 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.124 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.125 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.125 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.125 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.125 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.125 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.125 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.125 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.126 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.126 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.126 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.126 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.126 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.126 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.126 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.127 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.127 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.127 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.127 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.127 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.128 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.128 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.128 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.128 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.128 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.128 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.129 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.129 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.129 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.129 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.129 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.130 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.130 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.130 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.130 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.130 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.131 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.131 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.131 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.131 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.131 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.131 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.132 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.132 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.132 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.132 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.132 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.132 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.133 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.133 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.133 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.133 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.133 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.133 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.133 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.134 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.134 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.134 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.134 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.134 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.134 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.135 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.135 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.135 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.135 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.135 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.135 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.136 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.136 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.136 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.136 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.136 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.136 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.136 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.137 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.137 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.137 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.137 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.137 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.138 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.138 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.138 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.138 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.138 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.138 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.138 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.139 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.139 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.139 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.139 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.139 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.139 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.139 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.140 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.140 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.140 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.140 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.140 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.140 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.141 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.141 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.141 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.141 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.141 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.141 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.142 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.142 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.142 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.142 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.142 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.142 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.143 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.143 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.143 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.143 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.143 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.143 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.143 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.144 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.144 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.144 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.144 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.144 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.144 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.145 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.145 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.145 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.145 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.145 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.145 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.146 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.146 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.146 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.146 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.146 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.146 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.147 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.147 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.147 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.147 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.147 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.147 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.147 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.148 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.148 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.148 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.148 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.148 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.148 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.149 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.149 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.149 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.149 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.149 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.149 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.149 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.150 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.150 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.150 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.150 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.150 186237 DEBUG oslo_service.service [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.152 186237 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.237 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.238 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.238 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.238 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 25 13:44:34 np0005535656 python3.9[186743]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:44:34 np0005535656 systemd[1]: Starting libvirt QEMU daemon...
Nov 25 13:44:34 np0005535656 systemd[1]: Started libvirt QEMU daemon.
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.323 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f68d4b12550> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.328 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f68d4b12550> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.329 186237 INFO nova.virt.libvirt.driver [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.347 186237 WARNING nova.virt.libvirt.driver [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Nov 25 13:44:34 np0005535656 nova_compute[186233]: 2025-11-25 18:44:34.348 186237 DEBUG nova.virt.libvirt.volume.mount [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.339 186237 INFO nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Libvirt host capabilities <capabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <host>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <uuid>f1b97441-74f2-4cd5-987c-5e759ff70e72</uuid>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <arch>x86_64</arch>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model>EPYC-Rome-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <vendor>AMD</vendor>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <microcode version='16777317'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <signature family='23' model='49' stepping='0'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='x2apic'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='tsc-deadline'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='osxsave'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='hypervisor'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='tsc_adjust'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='spec-ctrl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='stibp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='arch-capabilities'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='cmp_legacy'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='topoext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='virt-ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='lbrv'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='tsc-scale'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='vmcb-clean'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='pause-filter'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='pfthreshold'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='svme-addr-chk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='rdctl-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='skip-l1dfl-vmentry'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='mds-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature name='pschange-mc-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <pages unit='KiB' size='4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <pages unit='KiB' size='2048'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <pages unit='KiB' size='1048576'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <power_management>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <suspend_mem/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <suspend_disk/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <suspend_hybrid/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </power_management>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <iommu support='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <migration_features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <live/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <uri_transports>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <uri_transport>tcp</uri_transport>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <uri_transport>rdma</uri_transport>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </uri_transports>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </migration_features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <topology>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <cells num='1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <cell id='0'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:          <memory unit='KiB'>7864324</memory>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:          <pages unit='KiB' size='4'>1966081</pages>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:          <pages unit='KiB' size='2048'>0</pages>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:          <distances>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:            <sibling id='0' value='10'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:          </distances>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:          <cpus num='8'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:          </cpus>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        </cell>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </cells>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </topology>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <cache>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </cache>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <secmodel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model>selinux</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <doi>0</doi>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </secmodel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <secmodel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model>dac</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <doi>0</doi>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </secmodel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </host>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <guest>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <os_type>hvm</os_type>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <arch name='i686'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <wordsize>32</wordsize>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <domain type='qemu'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <domain type='kvm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </arch>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <pae/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <nonpae/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <acpi default='on' toggle='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <apic default='on' toggle='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <cpuselection/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <deviceboot/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <disksnapshot default='on' toggle='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <externalSnapshot/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </guest>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <guest>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <os_type>hvm</os_type>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <arch name='x86_64'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <wordsize>64</wordsize>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <domain type='qemu'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <domain type='kvm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </arch>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <acpi default='on' toggle='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <apic default='on' toggle='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <cpuselection/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <deviceboot/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <disksnapshot default='on' toggle='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <externalSnapshot/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </guest>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 
Nov 25 13:44:35 np0005535656 nova_compute[186233]: </capabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.354 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.383 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 25 13:44:35 np0005535656 nova_compute[186233]: <domainCapabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <path>/usr/libexec/qemu-kvm</path>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <domain>kvm</domain>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <arch>i686</arch>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <vcpu max='4096'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <iothreads supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <os supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <enum name='firmware'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <loader supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>rom</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pflash</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='readonly'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>yes</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>no</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='secure'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>no</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </loader>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </os>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='host-passthrough' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='hostPassthroughMigratable'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>on</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>off</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='maximum' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='maximumMigratable'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>on</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>off</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='host-model' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <vendor>AMD</vendor>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='x2apic'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc-deadline'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='hypervisor'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc_adjust'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='spec-ctrl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='stibp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='cmp_legacy'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='overflow-recov'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='succor'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='amd-ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='virt-ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='lbrv'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc-scale'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='vmcb-clean'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='flushbyasid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='pause-filter'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='pfthreshold'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='svme-addr-chk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='disable' name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='custom' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Dhyana-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Genoa'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='auto-ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Genoa-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='auto-ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-128'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-256'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-512'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v6'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v7'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='KnightsMill'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512er'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512pf'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='KnightsMill-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512er'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512pf'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G4-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tbm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G5-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tbm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SierraForest'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cmpccxadd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SierraForest-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cmpccxadd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='athlon'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='athlon-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='core2duo'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='core2duo-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='coreduo'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='coreduo-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='n270'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='n270-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='phenom'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='phenom-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <memoryBacking supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <enum name='sourceType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>file</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>anonymous</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>memfd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </memoryBacking>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <devices>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <disk supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='diskDevice'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>disk</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>cdrom</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>floppy</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>lun</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='bus'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>fdc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>scsi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>sata</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-non-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </disk>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <graphics supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vnc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>egl-headless</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dbus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </graphics>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <video supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='modelType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vga</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>cirrus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>none</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>bochs</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ramfb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </video>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <hostdev supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='mode'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>subsystem</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='startupPolicy'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>default</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>mandatory</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>requisite</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>optional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='subsysType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pci</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>scsi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='capsType'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='pciBackend'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </hostdev>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <rng supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-non-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>random</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>egd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>builtin</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </rng>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <filesystem supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='driverType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>path</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>handle</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtiofs</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </filesystem>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <tpm supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tpm-tis</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tpm-crb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>emulator</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>external</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendVersion'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>2.0</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </tpm>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <redirdev supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='bus'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </redirdev>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <channel supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pty</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>unix</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </channel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <crypto supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>qemu</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>builtin</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </crypto>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <interface supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>default</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>passt</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </interface>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <panic supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>isa</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>hyperv</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </panic>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <console supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>null</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pty</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dev</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>file</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pipe</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>stdio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>udp</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tcp</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>unix</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>qemu-vdagent</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dbus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </console>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </devices>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <gic supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <vmcoreinfo supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <genid supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <backingStoreInput supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <backup supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <async-teardown supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <ps2 supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <sev supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <sgx supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <hyperv supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='features'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>relaxed</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vapic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>spinlocks</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vpindex</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>runtime</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>synic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>stimer</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>reset</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vendor_id</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>frequencies</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>reenlightenment</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tlbflush</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ipi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>avic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>emsr_bitmap</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>xmm_input</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <defaults>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <spinlocks>4095</spinlocks>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <stimer_direct>on</stimer_direct>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <tlbflush_direct>on</tlbflush_direct>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <tlbflush_extended>on</tlbflush_extended>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </defaults>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </hyperv>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <launchSecurity supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='sectype'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tdx</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </launchSecurity>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: </domainCapabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.393 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 25 13:44:35 np0005535656 nova_compute[186233]: <domainCapabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <path>/usr/libexec/qemu-kvm</path>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <domain>kvm</domain>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <arch>i686</arch>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <vcpu max='240'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <iothreads supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <os supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <enum name='firmware'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <loader supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>rom</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pflash</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='readonly'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>yes</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>no</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='secure'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>no</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </loader>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </os>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='host-passthrough' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='hostPassthroughMigratable'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>on</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>off</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='maximum' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='maximumMigratable'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>on</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>off</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='host-model' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <vendor>AMD</vendor>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='x2apic'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc-deadline'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='hypervisor'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc_adjust'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='spec-ctrl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='stibp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='cmp_legacy'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='overflow-recov'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='succor'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='amd-ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='virt-ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='lbrv'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc-scale'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='vmcb-clean'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='flushbyasid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='pause-filter'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='pfthreshold'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='svme-addr-chk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='disable' name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='custom' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 python3.9[186955]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Dhyana-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Genoa'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='auto-ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Genoa-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='auto-ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-128'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-256'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-512'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v6'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v7'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='KnightsMill'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512er'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512pf'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='KnightsMill-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512er'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512pf'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G4-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tbm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G5-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tbm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SierraForest'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cmpccxadd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SierraForest-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cmpccxadd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='athlon'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='athlon-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='core2duo'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='core2duo-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='coreduo'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='coreduo-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='n270'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='n270-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='phenom'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='phenom-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <memoryBacking supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <enum name='sourceType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>file</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>anonymous</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>memfd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </memoryBacking>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <devices>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <disk supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='diskDevice'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>disk</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>cdrom</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>floppy</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>lun</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='bus'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ide</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>fdc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>scsi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>sata</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-non-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </disk>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <graphics supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vnc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>egl-headless</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dbus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </graphics>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <video supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='modelType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vga</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>cirrus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>none</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>bochs</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ramfb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </video>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <hostdev supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='mode'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>subsystem</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='startupPolicy'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>default</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>mandatory</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>requisite</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>optional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='subsysType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pci</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>scsi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='capsType'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='pciBackend'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </hostdev>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <rng supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-non-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>random</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>egd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>builtin</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </rng>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <filesystem supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='driverType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>path</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>handle</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtiofs</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </filesystem>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <tpm supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tpm-tis</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tpm-crb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>emulator</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>external</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendVersion'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>2.0</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </tpm>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <redirdev supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='bus'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </redirdev>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <channel supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pty</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>unix</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </channel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <crypto supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>qemu</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>builtin</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </crypto>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <interface supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>default</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>passt</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </interface>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <panic supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>isa</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>hyperv</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </panic>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <console supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>null</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pty</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dev</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>file</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pipe</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>stdio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>udp</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tcp</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>unix</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>qemu-vdagent</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dbus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </console>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </devices>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <gic supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <vmcoreinfo supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <genid supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <backingStoreInput supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <backup supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <async-teardown supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <ps2 supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <sev supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <sgx supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <hyperv supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='features'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>relaxed</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vapic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>spinlocks</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vpindex</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>runtime</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>synic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>stimer</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>reset</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vendor_id</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>frequencies</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>reenlightenment</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tlbflush</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ipi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>avic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>emsr_bitmap</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>xmm_input</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <defaults>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <spinlocks>4095</spinlocks>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <stimer_direct>on</stimer_direct>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <tlbflush_direct>on</tlbflush_direct>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <tlbflush_extended>on</tlbflush_extended>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </defaults>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </hyperv>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <launchSecurity supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='sectype'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tdx</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </launchSecurity>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: </domainCapabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.423 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.430 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 25 13:44:35 np0005535656 nova_compute[186233]: <domainCapabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <path>/usr/libexec/qemu-kvm</path>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <domain>kvm</domain>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <arch>x86_64</arch>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <vcpu max='4096'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <iothreads supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <os supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <enum name='firmware'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>efi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <loader supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>rom</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pflash</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='readonly'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>yes</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>no</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='secure'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>yes</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>no</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </loader>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </os>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='host-passthrough' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='hostPassthroughMigratable'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>on</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>off</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='maximum' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='maximumMigratable'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>on</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>off</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='host-model' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <vendor>AMD</vendor>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='x2apic'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc-deadline'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='hypervisor'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc_adjust'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='spec-ctrl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='stibp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='cmp_legacy'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='overflow-recov'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='succor'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='amd-ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='virt-ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='lbrv'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc-scale'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='vmcb-clean'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='flushbyasid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='pause-filter'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='pfthreshold'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='svme-addr-chk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='disable' name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='custom' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Dhyana-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Genoa'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='auto-ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Genoa-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='auto-ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-128'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-256'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-512'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v6'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v7'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='KnightsMill'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512er'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512pf'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='KnightsMill-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512er'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512pf'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G4-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tbm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G5-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tbm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SierraForest'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cmpccxadd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SierraForest-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cmpccxadd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='athlon'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='athlon-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='core2duo'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='core2duo-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='coreduo'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='coreduo-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='n270'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='n270-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='phenom'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='phenom-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <memoryBacking supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <enum name='sourceType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>file</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>anonymous</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>memfd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </memoryBacking>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <devices>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <disk supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='diskDevice'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>disk</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>cdrom</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>floppy</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>lun</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='bus'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>fdc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>scsi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>sata</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-non-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </disk>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <graphics supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vnc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>egl-headless</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dbus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </graphics>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <video supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='modelType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vga</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>cirrus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>none</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>bochs</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ramfb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </video>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <hostdev supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='mode'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>subsystem</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='startupPolicy'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>default</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>mandatory</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>requisite</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>optional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='subsysType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pci</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>scsi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='capsType'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='pciBackend'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </hostdev>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <rng supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-non-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>random</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>egd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>builtin</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </rng>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <filesystem supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='driverType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>path</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>handle</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtiofs</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </filesystem>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <tpm supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tpm-tis</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tpm-crb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>emulator</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>external</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendVersion'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>2.0</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </tpm>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <redirdev supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='bus'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </redirdev>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <channel supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pty</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>unix</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </channel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <crypto supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>qemu</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>builtin</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </crypto>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <interface supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>default</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>passt</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </interface>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <panic supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>isa</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>hyperv</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </panic>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <console supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>null</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pty</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dev</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>file</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pipe</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>stdio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>udp</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tcp</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>unix</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>qemu-vdagent</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dbus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </console>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </devices>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <gic supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <vmcoreinfo supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <genid supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <backingStoreInput supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <backup supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <async-teardown supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <ps2 supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <sev supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <sgx supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <hyperv supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='features'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>relaxed</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vapic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>spinlocks</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vpindex</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>runtime</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>synic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>stimer</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>reset</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vendor_id</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>frequencies</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>reenlightenment</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tlbflush</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ipi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>avic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>emsr_bitmap</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>xmm_input</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <defaults>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <spinlocks>4095</spinlocks>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <stimer_direct>on</stimer_direct>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <tlbflush_direct>on</tlbflush_direct>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <tlbflush_extended>on</tlbflush_extended>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </defaults>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </hyperv>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <launchSecurity supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='sectype'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tdx</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </launchSecurity>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: </domainCapabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.499 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 25 13:44:35 np0005535656 nova_compute[186233]: <domainCapabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <path>/usr/libexec/qemu-kvm</path>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <domain>kvm</domain>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <arch>x86_64</arch>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <vcpu max='240'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <iothreads supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <os supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <enum name='firmware'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <loader supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>rom</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pflash</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='readonly'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>yes</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>no</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='secure'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>no</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </loader>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </os>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='host-passthrough' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='hostPassthroughMigratable'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>on</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>off</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='maximum' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='maximumMigratable'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>on</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>off</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='host-model' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <vendor>AMD</vendor>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='x2apic'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc-deadline'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='hypervisor'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc_adjust'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='spec-ctrl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='stibp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='cmp_legacy'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='overflow-recov'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='succor'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='amd-ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='virt-ssbd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='lbrv'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='tsc-scale'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='vmcb-clean'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='flushbyasid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='pause-filter'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='pfthreshold'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='svme-addr-chk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <feature policy='disable' name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <mode name='custom' supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Broadwell-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cascadelake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Cooperlake-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Denverton-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Dhyana-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Genoa'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='auto-ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Genoa-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='auto-ibrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Milan-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amd-psfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='stibp-always-on'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-Rome-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='EPYC-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='GraniteRapids-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-128'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-256'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx10-512'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='prefetchiti'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Haswell-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-noTSX'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v6'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Icelake-Server-v7'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='IvyBridge-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='KnightsMill'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512er'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512pf'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='KnightsMill-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512er'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512pf'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G4-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tbm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Opteron_G5-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fma4'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tbm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xop'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SapphireRapids-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='amx-tile'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-bf16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-fp16'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bitalg'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrc'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fzrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='la57'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='taa-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xfd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SierraForest'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cmpccxadd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='SierraForest-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ifma'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cmpccxadd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fbsdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='fsrs'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ibrs-all'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mcdt-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pbrsb-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='psdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='serialize'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vaes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Client-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='hle'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='rtm'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Skylake-Server-v5'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512bw'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512cd'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512dq'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512f'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='avx512vl'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='invpcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pcid'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='pku'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='mpx'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v2'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v3'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='core-capability'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='split-lock-detect'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='Snowridge-v4'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='cldemote'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='erms'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='gfni'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdir64b'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='movdiri'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='xsaves'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='athlon'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='athlon-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='core2duo'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='core2duo-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='coreduo'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='coreduo-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='n270'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='n270-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='ss'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='phenom'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <blockers model='phenom-v1'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnow'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <feature name='3dnowext'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </blockers>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </mode>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <memoryBacking supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <enum name='sourceType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>file</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>anonymous</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <value>memfd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </memoryBacking>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <devices>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <disk supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='diskDevice'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>disk</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>cdrom</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>floppy</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>lun</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='bus'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ide</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>fdc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>scsi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>sata</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-non-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </disk>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <graphics supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vnc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>egl-headless</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dbus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </graphics>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <video supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='modelType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vga</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>cirrus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>none</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>bochs</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ramfb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </video>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <hostdev supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='mode'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>subsystem</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='startupPolicy'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>default</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>mandatory</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>requisite</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>optional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='subsysType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pci</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>scsi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='capsType'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='pciBackend'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </hostdev>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <rng supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtio-non-transitional</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>random</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>egd</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>builtin</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </rng>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <filesystem supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='driverType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>path</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>handle</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>virtiofs</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </filesystem>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <tpm supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tpm-tis</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tpm-crb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>emulator</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>external</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendVersion'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>2.0</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </tpm>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <redirdev supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='bus'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>usb</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </redirdev>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <channel supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pty</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>unix</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </channel>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <crypto supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>qemu</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendModel'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>builtin</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </crypto>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <interface supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='backendType'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>default</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>passt</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </interface>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <panic supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='model'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>isa</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>hyperv</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </panic>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <console supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='type'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>null</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vc</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pty</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dev</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>file</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>pipe</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>stdio</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>udp</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tcp</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>unix</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>qemu-vdagent</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>dbus</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </console>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </devices>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <gic supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <vmcoreinfo supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <genid supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <backingStoreInput supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <backup supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <async-teardown supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <ps2 supported='yes'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <sev supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <sgx supported='no'/>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <hyperv supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='features'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>relaxed</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vapic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>spinlocks</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vpindex</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>runtime</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>synic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>stimer</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>reset</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>vendor_id</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>frequencies</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>reenlightenment</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tlbflush</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>ipi</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>avic</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>emsr_bitmap</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>xmm_input</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <defaults>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <spinlocks>4095</spinlocks>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <stimer_direct>on</stimer_direct>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <tlbflush_direct>on</tlbflush_direct>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <tlbflush_extended>on</tlbflush_extended>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </defaults>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </hyperv>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    <launchSecurity supported='yes'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      <enum name='sectype'>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:        <value>tdx</value>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:      </enum>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:    </launchSecurity>
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  </features>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: </domainCapabilities>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.557 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.558 186237 INFO nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Secure Boot support detected#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.560 186237 INFO nova.virt.libvirt.driver [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.575 186237 DEBUG nova.virt.libvirt.driver [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 25 13:44:35 np0005535656 nova_compute[186233]:  <model>Nehalem</model>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: </cpu>
Nov 25 13:44:35 np0005535656 nova_compute[186233]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.579 186237 DEBUG nova.virt.libvirt.driver [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.665 186237 INFO nova.virt.node [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Determined node identity 752b63a7-2ce2-4d83-a281-12c9803714ea from /var/lib/nova/compute_id#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.700 186237 WARNING nova.compute.manager [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Compute nodes ['752b63a7-2ce2-4d83-a281-12c9803714ea'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.729 186237 INFO nova.compute.manager [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.761 186237 WARNING nova.compute.manager [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.761 186237 DEBUG oslo_concurrency.lockutils [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.761 186237 DEBUG oslo_concurrency.lockutils [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.761 186237 DEBUG oslo_concurrency.lockutils [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:44:35 np0005535656 nova_compute[186233]: 2025-11-25 18:44:35.762 186237 DEBUG nova.compute.resource_tracker [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:44:35 np0005535656 systemd[1]: Starting libvirt nodedev daemon...
Nov 25 13:44:35 np0005535656 systemd[1]: Started libvirt nodedev daemon.
Nov 25 13:44:36 np0005535656 nova_compute[186233]: 2025-11-25 18:44:36.109 186237 WARNING nova.virt.libvirt.driver [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:44:36 np0005535656 nova_compute[186233]: 2025-11-25 18:44:36.111 186237 DEBUG nova.compute.resource_tracker [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6180MB free_disk=73.3676528930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:44:36 np0005535656 nova_compute[186233]: 2025-11-25 18:44:36.111 186237 DEBUG oslo_concurrency.lockutils [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:44:36 np0005535656 nova_compute[186233]: 2025-11-25 18:44:36.112 186237 DEBUG oslo_concurrency.lockutils [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:44:36 np0005535656 nova_compute[186233]: 2025-11-25 18:44:36.130 186237 WARNING nova.compute.resource_tracker [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] No compute node record for compute-1.ctlplane.example.com:752b63a7-2ce2-4d83-a281-12c9803714ea: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 752b63a7-2ce2-4d83-a281-12c9803714ea could not be found.#033[00m
Nov 25 13:44:36 np0005535656 nova_compute[186233]: 2025-11-25 18:44:36.149 186237 INFO nova.compute.resource_tracker [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 752b63a7-2ce2-4d83-a281-12c9803714ea#033[00m
Nov 25 13:44:36 np0005535656 nova_compute[186233]: 2025-11-25 18:44:36.258 186237 DEBUG nova.compute.resource_tracker [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:44:36 np0005535656 nova_compute[186233]: 2025-11-25 18:44:36.258 186237 DEBUG nova.compute.resource_tracker [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:44:37 np0005535656 python3.9[187155]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:44:37 np0005535656 systemd[1]: Stopping nova_compute container...
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.364 186237 INFO nova.scheduler.client.report [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] [req-aae31d6e-b5f4-416f-83ac-9ea8bc515c32] Created resource provider record via placement API for resource provider with UUID 752b63a7-2ce2-4d83-a281-12c9803714ea and name compute-1.ctlplane.example.com.#033[00m
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.491 186237 DEBUG nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 25 13:44:37 np0005535656 nova_compute[186233]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.491 186237 INFO nova.virt.libvirt.host [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.492 186237 DEBUG nova.compute.provider_tree [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.492 186237 DEBUG nova.virt.libvirt.driver [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.494 186237 DEBUG nova.virt.libvirt.driver [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Libvirt baseline CPU <cpu>
Nov 25 13:44:37 np0005535656 nova_compute[186233]:  <arch>x86_64</arch>
Nov 25 13:44:37 np0005535656 nova_compute[186233]:  <model>Nehalem</model>
Nov 25 13:44:37 np0005535656 nova_compute[186233]:  <vendor>AMD</vendor>
Nov 25 13:44:37 np0005535656 nova_compute[186233]:  <topology sockets="8" cores="1" threads="1"/>
Nov 25 13:44:37 np0005535656 nova_compute[186233]: </cpu>
Nov 25 13:44:37 np0005535656 nova_compute[186233]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.531 186237 DEBUG oslo_concurrency.lockutils [None req-f564e043-8e1b-462a-8539-ab9bdcff6829 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.532 186237 DEBUG oslo_concurrency.lockutils [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.532 186237 DEBUG oslo_concurrency.lockutils [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:44:37 np0005535656 nova_compute[186233]: 2025-11-25 18:44:37.533 186237 DEBUG oslo_concurrency.lockutils [None req-31844abd-4f3c-4d52-a3f3-c0ef54b3c6c0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:44:38 np0005535656 virtqemud[186765]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 25 13:44:38 np0005535656 virtqemud[186765]: hostname: compute-1
Nov 25 13:44:38 np0005535656 virtqemud[186765]: End of file while reading data: Input/output error
Nov 25 13:44:38 np0005535656 systemd[1]: libpod-e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3.scope: Deactivated successfully.
Nov 25 13:44:38 np0005535656 podman[187159]: 2025-11-25 18:44:38.1282144 +0000 UTC m=+0.953309250 container died e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 25 13:44:38 np0005535656 systemd[1]: libpod-e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3.scope: Consumed 3.518s CPU time.
Nov 25 13:44:38 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3-userdata-shm.mount: Deactivated successfully.
Nov 25 13:44:38 np0005535656 systemd[1]: var-lib-containers-storage-overlay-f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db-merged.mount: Deactivated successfully.
Nov 25 13:44:38 np0005535656 podman[187159]: 2025-11-25 18:44:38.206033696 +0000 UTC m=+1.031128526 container cleanup e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:44:38 np0005535656 podman[187159]: nova_compute
Nov 25 13:44:38 np0005535656 podman[187190]: nova_compute
Nov 25 13:44:38 np0005535656 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 25 13:44:38 np0005535656 systemd[1]: Stopped nova_compute container.
Nov 25 13:44:38 np0005535656 systemd[1]: Starting nova_compute container...
Nov 25 13:44:38 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:44:38 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:38 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:38 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:38 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:38 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f55e8bc38227585f9ea7b084d14b6225be7638caa20021a3e48cff83a203a6db/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:38 np0005535656 podman[187203]: 2025-11-25 18:44:38.442524218 +0000 UTC m=+0.105965707 container init e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 25 13:44:38 np0005535656 podman[187203]: 2025-11-25 18:44:38.448935814 +0000 UTC m=+0.112377283 container start e43444f89e39e128d4409a1827263aab2f317fef404e25b44e852e58ce0d26a3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 25 13:44:38 np0005535656 nova_compute[187219]: + sudo -E kolla_set_configs
Nov 25 13:44:38 np0005535656 podman[187203]: nova_compute
Nov 25 13:44:38 np0005535656 systemd[1]: Started nova_compute container.
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Validating config file
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Copying service configuration files
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Deleting /etc/ceph
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Creating directory /etc/ceph
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /etc/ceph
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Writing out command to execute
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 13:44:38 np0005535656 nova_compute[187219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 13:44:38 np0005535656 nova_compute[187219]: ++ cat /run_command
Nov 25 13:44:38 np0005535656 nova_compute[187219]: + CMD=nova-compute
Nov 25 13:44:38 np0005535656 nova_compute[187219]: + ARGS=
Nov 25 13:44:38 np0005535656 nova_compute[187219]: + sudo kolla_copy_cacerts
Nov 25 13:44:38 np0005535656 nova_compute[187219]: + [[ ! -n '' ]]
Nov 25 13:44:38 np0005535656 nova_compute[187219]: + . kolla_extend_start
Nov 25 13:44:38 np0005535656 nova_compute[187219]: Running command: 'nova-compute'
Nov 25 13:44:38 np0005535656 nova_compute[187219]: + echo 'Running command: '\''nova-compute'\'''
Nov 25 13:44:38 np0005535656 nova_compute[187219]: + umask 0022
Nov 25 13:44:38 np0005535656 nova_compute[187219]: + exec nova-compute
Nov 25 13:44:39 np0005535656 python3.9[187382]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 25 13:44:39 np0005535656 systemd[1]: Started libpod-conmon-0bbd074b8812f8ef27c2fcf1e829e82bbfee69967e9e68ec2cef906e0d9546d4.scope.
Nov 25 13:44:39 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:44:39 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82b885904560d3d7229a0b03526ac80a0df9d547d2eeec9858efcb149d7d39c0/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:39 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82b885904560d3d7229a0b03526ac80a0df9d547d2eeec9858efcb149d7d39c0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:39 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82b885904560d3d7229a0b03526ac80a0df9d547d2eeec9858efcb149d7d39c0/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 25 13:44:39 np0005535656 podman[187408]: 2025-11-25 18:44:39.768250803 +0000 UTC m=+0.182850187 container init 0bbd074b8812f8ef27c2fcf1e829e82bbfee69967e9e68ec2cef906e0d9546d4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 25 13:44:39 np0005535656 podman[187408]: 2025-11-25 18:44:39.781932478 +0000 UTC m=+0.196531842 container start 0bbd074b8812f8ef27c2fcf1e829e82bbfee69967e9e68ec2cef906e0d9546d4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 13:44:39 np0005535656 python3.9[187382]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Applying nova statedir ownership
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 25 13:44:39 np0005535656 nova_compute_init[187430]: INFO:nova_statedir:Nova statedir ownership complete
Nov 25 13:44:39 np0005535656 systemd[1]: libpod-0bbd074b8812f8ef27c2fcf1e829e82bbfee69967e9e68ec2cef906e0d9546d4.scope: Deactivated successfully.
Nov 25 13:44:39 np0005535656 podman[187431]: 2025-11-25 18:44:39.872552953 +0000 UTC m=+0.047863279 container died 0bbd074b8812f8ef27c2fcf1e829e82bbfee69967e9e68ec2cef906e0d9546d4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init)
Nov 25 13:44:39 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0bbd074b8812f8ef27c2fcf1e829e82bbfee69967e9e68ec2cef906e0d9546d4-userdata-shm.mount: Deactivated successfully.
Nov 25 13:44:39 np0005535656 systemd[1]: var-lib-containers-storage-overlay-82b885904560d3d7229a0b03526ac80a0df9d547d2eeec9858efcb149d7d39c0-merged.mount: Deactivated successfully.
Nov 25 13:44:39 np0005535656 podman[187442]: 2025-11-25 18:44:39.927906786 +0000 UTC m=+0.063707591 container cleanup 0bbd074b8812f8ef27c2fcf1e829e82bbfee69967e9e68ec2cef906e0d9546d4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 25 13:44:39 np0005535656 systemd[1]: libpod-conmon-0bbd074b8812f8ef27c2fcf1e829e82bbfee69967e9e68ec2cef906e0d9546d4.scope: Deactivated successfully.
Nov 25 13:44:40 np0005535656 nova_compute[187219]: 2025-11-25 18:44:40.477 187223 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 25 13:44:40 np0005535656 nova_compute[187219]: 2025-11-25 18:44:40.477 187223 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 25 13:44:40 np0005535656 nova_compute[187219]: 2025-11-25 18:44:40.478 187223 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 25 13:44:40 np0005535656 nova_compute[187219]: 2025-11-25 18:44:40.478 187223 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 25 13:44:40 np0005535656 systemd[1]: session-25.scope: Deactivated successfully.
Nov 25 13:44:40 np0005535656 systemd[1]: session-25.scope: Consumed 2min 15.401s CPU time.
Nov 25 13:44:40 np0005535656 systemd-logind[788]: Session 25 logged out. Waiting for processes to exit.
Nov 25 13:44:40 np0005535656 systemd-logind[788]: Removed session 25.
Nov 25 13:44:40 np0005535656 nova_compute[187219]: 2025-11-25 18:44:40.634 187223 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:44:40 np0005535656 nova_compute[187219]: 2025-11-25 18:44:40.662 187223 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:44:40 np0005535656 nova_compute[187219]: 2025-11-25 18:44:40.663 187223 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.130 187223 INFO nova.virt.driver [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.263 187223 INFO nova.compute.provider_config [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.272 187223 DEBUG oslo_concurrency.lockutils [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.272 187223 DEBUG oslo_concurrency.lockutils [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.273 187223 DEBUG oslo_concurrency.lockutils [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.273 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.273 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.273 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.274 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.274 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.274 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.274 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.274 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.274 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.274 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.275 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.275 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.275 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.275 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.275 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.275 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.275 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.275 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.276 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.276 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.276 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.276 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.276 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.276 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.276 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.277 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.277 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.277 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.277 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.277 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.278 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.278 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.278 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.278 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.278 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.278 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.278 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.279 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.279 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.279 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.279 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.279 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.279 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.279 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.280 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.280 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.280 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.280 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.280 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.280 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.281 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.281 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.281 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.281 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.281 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.281 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.282 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.282 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.282 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.282 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.282 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.282 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.283 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.283 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.283 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.283 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.283 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.283 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.283 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.284 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.284 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.284 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.284 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.284 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.284 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.285 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.285 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.285 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.285 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.285 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.285 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.285 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.286 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.286 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.286 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.286 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.286 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.286 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.286 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.287 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.287 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.287 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.287 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.287 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.287 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.287 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.288 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.288 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.288 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.288 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.288 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.288 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.289 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.289 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.289 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.289 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.289 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.290 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.290 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.290 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.290 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.290 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.291 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.291 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.291 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.291 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.291 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.291 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.292 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.292 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.292 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.292 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.292 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.292 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.293 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.293 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.293 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.293 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.293 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.293 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.294 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.294 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.294 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.294 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.294 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.294 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.295 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.295 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.295 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.295 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.295 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.295 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.296 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.296 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.296 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.296 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.296 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.296 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.297 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.297 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.297 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.297 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.297 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.298 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.298 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.298 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.298 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.298 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.298 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.299 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.299 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.299 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.299 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.299 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.299 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.299 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.300 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.300 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.300 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.300 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.300 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.300 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.300 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.301 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.301 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.301 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.301 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.301 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.301 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.302 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.302 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.302 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.302 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.302 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.302 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.302 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.303 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.303 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.303 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.303 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.303 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.303 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.303 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.304 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.304 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.304 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.304 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.304 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.304 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.304 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.305 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.305 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.305 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.305 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.305 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.305 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.305 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.306 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.306 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.306 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.306 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.306 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.306 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.307 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.307 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.307 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.307 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.307 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.307 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.307 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.308 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.308 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.308 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.308 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.308 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.308 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.308 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.309 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.309 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.309 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.309 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.309 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.309 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.309 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.310 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.310 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.310 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.310 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.310 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.310 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.310 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.311 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.311 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.311 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.311 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.311 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.311 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.312 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.312 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.312 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.312 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.312 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.312 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.312 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.313 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.313 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.313 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.313 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.313 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.313 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.313 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.314 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.314 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.314 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.314 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.314 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.314 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.315 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.315 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.315 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.315 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.315 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.315 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.315 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.316 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.316 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.316 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.316 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.316 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.316 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.317 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.317 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.317 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.317 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.317 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.317 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.317 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.318 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.318 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.318 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.318 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.318 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.318 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.318 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.319 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.319 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.319 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.319 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.319 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.319 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.319 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.320 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.320 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.320 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.320 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.320 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.320 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.320 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.320 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.321 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.321 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.321 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.321 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.321 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.321 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.321 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.322 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.322 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.322 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.322 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.322 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.322 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.322 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.323 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.323 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.323 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.323 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.323 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.323 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.323 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.324 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.324 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.324 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.324 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.324 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.324 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.324 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.325 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.325 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.325 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.325 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.325 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.325 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.325 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.326 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.326 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.326 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.326 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.326 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.326 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.326 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.327 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.327 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.327 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.327 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.327 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.328 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.328 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.328 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.328 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.328 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.328 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.328 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.329 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.329 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.329 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.329 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.329 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.329 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.329 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.330 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.330 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.330 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.330 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.330 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.330 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.330 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.331 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.331 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.331 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.331 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.331 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.331 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.332 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.332 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.332 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.332 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.332 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.332 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.332 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.333 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.333 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.333 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.333 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.333 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.333 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.333 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.334 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.334 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.334 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.334 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.334 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.334 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.334 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.334 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.335 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.335 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.335 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.335 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.335 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.336 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.336 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.336 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.336 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.336 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.336 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.336 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.336 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.337 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.337 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.337 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.337 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.337 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.337 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.337 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.338 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.338 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.338 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.338 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.338 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.338 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.339 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.339 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.339 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.339 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.339 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.339 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.339 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.340 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.340 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.340 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.340 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.340 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.340 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.340 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.341 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.341 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.341 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.341 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.341 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.341 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.341 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.342 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.342 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.342 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.342 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.342 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.342 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.343 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.343 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.343 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.343 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.343 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.343 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.344 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.344 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.344 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.344 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.344 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.344 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.344 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.345 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.345 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.345 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.345 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.345 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.345 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.345 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.346 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.346 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.346 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.346 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.346 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.346 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.346 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.347 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.347 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.347 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.347 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.347 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.347 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.348 187223 WARNING oslo_config.cfg [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 25 13:44:41 np0005535656 nova_compute[187219]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 25 13:44:41 np0005535656 nova_compute[187219]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 25 13:44:41 np0005535656 nova_compute[187219]: and ``live_migration_inbound_addr`` respectively.
Nov 25 13:44:41 np0005535656 nova_compute[187219]: ).  Its value may be silently ignored in the future.
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.348 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.348 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.348 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.349 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.349 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.349 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.349 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.349 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.349 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.350 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.350 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.350 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.350 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.350 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.350 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.351 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.351 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.351 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.351 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.351 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.351 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.351 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.351 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.352 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.352 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.352 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.352 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.352 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.352 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.353 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.353 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.353 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.353 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.353 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.353 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.354 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.354 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.354 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.354 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.354 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.354 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.355 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.355 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.355 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.355 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.355 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.356 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.356 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.356 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.356 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.356 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.356 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.357 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.357 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.357 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.357 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.357 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.357 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.358 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.358 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.358 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.358 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.358 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.358 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.358 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.358 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.359 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.359 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.359 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.359 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.359 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.359 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.359 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.360 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.360 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.360 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.360 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.360 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.360 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.360 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.361 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.361 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.361 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.361 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.361 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.361 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.362 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.362 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.362 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.362 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.362 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.362 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.362 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.363 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.363 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.363 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.363 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.363 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.363 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.363 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.364 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.364 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.364 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.364 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.364 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.364 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.364 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.365 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.365 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.365 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.365 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.365 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.365 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.366 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.366 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.366 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.366 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.366 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.366 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.366 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.366 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.367 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.367 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.367 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.367 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.367 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.367 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.368 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.368 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.368 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.368 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.368 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.368 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.368 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.369 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.369 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.369 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.369 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.369 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.369 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.370 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.370 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.370 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.370 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.370 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.370 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.370 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.371 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.371 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.371 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.371 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.371 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.371 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.371 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.372 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.372 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.372 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.372 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.372 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.372 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.372 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.373 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.373 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.373 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.373 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.373 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.373 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.373 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.374 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.374 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.374 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.374 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.374 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.374 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.374 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.375 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.375 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.375 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.375 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.375 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.375 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.375 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.376 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.376 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.376 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.376 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.376 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.376 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.376 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.377 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.377 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.377 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.377 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.377 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.377 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.377 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.378 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.378 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.378 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.378 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.378 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.378 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.379 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.379 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.379 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.379 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.379 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.379 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.379 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.379 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.380 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.380 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.380 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.380 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.380 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.380 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.380 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.381 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.381 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.381 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.381 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.381 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.381 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.381 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.382 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.382 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.382 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.382 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.382 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.382 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.382 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.383 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.383 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.383 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.383 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.383 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.383 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.383 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.384 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.384 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.384 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.384 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.384 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.384 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.384 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.384 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.385 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.385 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.385 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.385 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.385 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.386 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.386 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.386 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.386 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.386 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.386 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.386 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.386 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.387 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.387 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.387 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.387 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.387 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.387 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.387 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.388 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.388 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.388 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.388 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.388 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.388 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.388 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.389 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.389 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.389 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.389 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.389 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.389 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.389 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.390 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.390 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.390 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.390 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.390 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.390 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.390 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.390 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.391 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.391 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.391 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.391 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.391 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.391 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.392 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.392 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.392 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.392 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.392 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.392 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.392 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.393 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.393 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.393 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.393 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.393 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.393 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.394 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.394 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.394 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.394 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.394 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.394 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.394 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.395 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.395 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.395 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.395 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.395 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.395 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.395 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.396 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.396 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.396 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.396 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.396 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.396 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.396 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.397 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.397 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.397 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.397 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.397 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.397 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.397 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.398 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.398 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.398 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.398 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.398 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.398 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.398 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.399 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.399 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.399 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.399 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.399 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.399 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.399 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.399 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.400 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.400 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.400 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.400 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.400 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.400 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.400 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.401 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.401 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.401 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.401 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.401 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.401 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.401 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.402 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.402 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.402 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.402 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.402 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.402 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.402 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.403 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.403 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.403 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.403 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.403 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.403 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.403 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.403 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.404 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.404 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.404 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.404 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.404 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.404 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.404 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.405 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.405 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.405 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.405 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.405 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.405 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.405 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.406 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.406 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.406 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.406 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.406 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.406 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.406 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.407 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.407 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.407 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.407 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.407 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.407 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.408 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.408 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.408 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.408 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.408 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.409 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.409 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.409 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.409 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.409 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.409 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.410 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.410 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.410 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.410 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.410 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.410 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.411 187223 DEBUG oslo_service.service [None req-f76a2d73-ba85-4271-999c-4ab8f8c6c302 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.412 187223 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.428 187223 INFO nova.virt.node [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Determined node identity 752b63a7-2ce2-4d83-a281-12c9803714ea from /var/lib/nova/compute_id#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.428 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.429 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.429 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.429 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.444 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fb66174d790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.446 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fb66174d790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.447 187223 INFO nova.virt.libvirt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.454 187223 INFO nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Libvirt host capabilities <capabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <host>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <uuid>f1b97441-74f2-4cd5-987c-5e759ff70e72</uuid>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <arch>x86_64</arch>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model>EPYC-Rome-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <vendor>AMD</vendor>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <microcode version='16777317'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <signature family='23' model='49' stepping='0'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='x2apic'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='tsc-deadline'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='osxsave'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='hypervisor'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='tsc_adjust'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='spec-ctrl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='stibp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='arch-capabilities'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='cmp_legacy'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='topoext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='virt-ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='lbrv'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='tsc-scale'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='vmcb-clean'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='pause-filter'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='pfthreshold'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='svme-addr-chk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='rdctl-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='skip-l1dfl-vmentry'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='mds-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature name='pschange-mc-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <pages unit='KiB' size='4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <pages unit='KiB' size='2048'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <pages unit='KiB' size='1048576'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <power_management>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <suspend_mem/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <suspend_disk/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <suspend_hybrid/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </power_management>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <iommu support='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <migration_features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <live/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <uri_transports>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <uri_transport>tcp</uri_transport>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <uri_transport>rdma</uri_transport>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </uri_transports>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </migration_features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <topology>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <cells num='1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <cell id='0'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:          <memory unit='KiB'>7864324</memory>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:          <pages unit='KiB' size='4'>1966081</pages>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:          <pages unit='KiB' size='2048'>0</pages>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:          <distances>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:            <sibling id='0' value='10'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:          </distances>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:          <cpus num='8'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:          </cpus>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        </cell>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </cells>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </topology>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <cache>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </cache>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <secmodel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model>selinux</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <doi>0</doi>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </secmodel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <secmodel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model>dac</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <doi>0</doi>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </secmodel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </host>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <guest>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <os_type>hvm</os_type>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <arch name='i686'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <wordsize>32</wordsize>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <domain type='qemu'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <domain type='kvm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </arch>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <pae/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <nonpae/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <acpi default='on' toggle='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <apic default='on' toggle='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <cpuselection/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <deviceboot/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <disksnapshot default='on' toggle='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <externalSnapshot/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </guest>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <guest>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <os_type>hvm</os_type>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <arch name='x86_64'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <wordsize>64</wordsize>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <domain type='qemu'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <domain type='kvm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </arch>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <acpi default='on' toggle='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <apic default='on' toggle='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <cpuselection/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <deviceboot/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <disksnapshot default='on' toggle='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <externalSnapshot/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </guest>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 
Nov 25 13:44:41 np0005535656 nova_compute[187219]: </capabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.462 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.465 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 25 13:44:41 np0005535656 nova_compute[187219]: <domainCapabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <path>/usr/libexec/qemu-kvm</path>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <domain>kvm</domain>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <arch>i686</arch>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <vcpu max='240'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <iothreads supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <os supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <enum name='firmware'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <loader supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>rom</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pflash</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='readonly'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>yes</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>no</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='secure'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>no</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </loader>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </os>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='host-passthrough' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='hostPassthroughMigratable'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>on</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>off</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='maximum' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='maximumMigratable'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>on</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>off</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='host-model' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <vendor>AMD</vendor>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='x2apic'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc-deadline'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='hypervisor'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc_adjust'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='spec-ctrl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='stibp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='cmp_legacy'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='overflow-recov'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='succor'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='amd-ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='virt-ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='lbrv'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc-scale'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='vmcb-clean'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='flushbyasid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='pause-filter'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='pfthreshold'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='svme-addr-chk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='disable' name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='custom' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Dhyana-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Genoa'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='auto-ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Genoa-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='auto-ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-128'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-256'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-512'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v6'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v7'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='KnightsMill'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512er'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512pf'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='KnightsMill-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512er'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512pf'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G4-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tbm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G5-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tbm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SierraForest'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cmpccxadd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SierraForest-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cmpccxadd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='athlon'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='athlon-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='core2duo'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='core2duo-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='coreduo'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='coreduo-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='n270'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='n270-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='phenom'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='phenom-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <memoryBacking supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <enum name='sourceType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>file</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>anonymous</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>memfd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </memoryBacking>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <devices>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <disk supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='diskDevice'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>disk</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>cdrom</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>floppy</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>lun</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='bus'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ide</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>fdc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>scsi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>sata</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-non-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <graphics supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vnc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>egl-headless</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dbus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </graphics>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <video supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='modelType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vga</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>cirrus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>none</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>bochs</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ramfb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </video>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <hostdev supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='mode'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>subsystem</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='startupPolicy'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>default</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>mandatory</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>requisite</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>optional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='subsysType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pci</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>scsi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='capsType'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='pciBackend'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </hostdev>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <rng supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-non-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>random</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>egd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>builtin</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </rng>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <filesystem supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='driverType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>path</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>handle</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtiofs</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </filesystem>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <tpm supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tpm-tis</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tpm-crb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>emulator</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>external</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendVersion'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>2.0</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </tpm>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <redirdev supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='bus'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </redirdev>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <channel supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pty</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>unix</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </channel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <crypto supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>qemu</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>builtin</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </crypto>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <interface supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>default</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>passt</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </interface>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <panic supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>isa</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>hyperv</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </panic>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <console supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>null</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pty</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dev</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>file</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pipe</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>stdio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>udp</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tcp</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>unix</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>qemu-vdagent</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dbus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </console>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </devices>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <gic supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <vmcoreinfo supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <genid supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <backingStoreInput supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <backup supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <async-teardown supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <ps2 supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <sev supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <sgx supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <hyperv supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='features'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>relaxed</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vapic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>spinlocks</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vpindex</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>runtime</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>synic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>stimer</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>reset</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vendor_id</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>frequencies</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>reenlightenment</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tlbflush</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ipi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>avic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>emsr_bitmap</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>xmm_input</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <defaults>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <spinlocks>4095</spinlocks>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <stimer_direct>on</stimer_direct>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <tlbflush_direct>on</tlbflush_direct>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <tlbflush_extended>on</tlbflush_extended>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </defaults>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </hyperv>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <launchSecurity supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='sectype'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tdx</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </launchSecurity>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: </domainCapabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.472 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 25 13:44:41 np0005535656 nova_compute[187219]: <domainCapabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <path>/usr/libexec/qemu-kvm</path>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <domain>kvm</domain>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <arch>i686</arch>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <vcpu max='4096'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <iothreads supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <os supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <enum name='firmware'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <loader supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>rom</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pflash</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='readonly'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>yes</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>no</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='secure'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>no</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </loader>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </os>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='host-passthrough' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='hostPassthroughMigratable'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>on</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>off</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='maximum' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='maximumMigratable'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>on</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>off</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='host-model' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <vendor>AMD</vendor>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='x2apic'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc-deadline'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='hypervisor'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc_adjust'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='spec-ctrl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='stibp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='cmp_legacy'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='overflow-recov'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='succor'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='amd-ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='virt-ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='lbrv'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc-scale'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='vmcb-clean'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='flushbyasid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='pause-filter'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='pfthreshold'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='svme-addr-chk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='disable' name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='custom' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Dhyana-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Genoa'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='auto-ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Genoa-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='auto-ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-128'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-256'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-512'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v6'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v7'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='KnightsMill'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512er'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512pf'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='KnightsMill-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512er'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512pf'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G4-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tbm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G5-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tbm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SierraForest'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cmpccxadd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SierraForest-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cmpccxadd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='athlon'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='athlon-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='core2duo'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='core2duo-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='coreduo'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='coreduo-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='n270'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='n270-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='phenom'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='phenom-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <memoryBacking supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <enum name='sourceType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>file</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>anonymous</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>memfd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </memoryBacking>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <devices>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <disk supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='diskDevice'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>disk</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>cdrom</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>floppy</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>lun</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='bus'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>fdc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>scsi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>sata</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-non-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <graphics supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vnc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>egl-headless</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dbus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </graphics>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <video supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='modelType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vga</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>cirrus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>none</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>bochs</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ramfb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </video>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <hostdev supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='mode'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>subsystem</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='startupPolicy'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>default</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>mandatory</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>requisite</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>optional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='subsysType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pci</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>scsi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='capsType'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='pciBackend'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </hostdev>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <rng supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-non-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>random</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>egd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>builtin</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </rng>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <filesystem supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='driverType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>path</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>handle</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtiofs</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </filesystem>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <tpm supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tpm-tis</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tpm-crb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>emulator</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>external</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendVersion'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>2.0</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </tpm>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <redirdev supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='bus'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </redirdev>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <channel supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pty</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>unix</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </channel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <crypto supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>qemu</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>builtin</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </crypto>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <interface supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>default</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>passt</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </interface>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <panic supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>isa</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>hyperv</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </panic>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <console supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>null</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pty</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dev</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>file</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pipe</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>stdio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>udp</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tcp</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>unix</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>qemu-vdagent</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dbus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </console>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </devices>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <gic supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <vmcoreinfo supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <genid supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <backingStoreInput supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <backup supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <async-teardown supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <ps2 supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <sev supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <sgx supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <hyperv supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='features'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>relaxed</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vapic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>spinlocks</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vpindex</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>runtime</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>synic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>stimer</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>reset</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vendor_id</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>frequencies</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>reenlightenment</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tlbflush</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ipi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>avic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>emsr_bitmap</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>xmm_input</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <defaults>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <spinlocks>4095</spinlocks>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <stimer_direct>on</stimer_direct>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <tlbflush_direct>on</tlbflush_direct>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <tlbflush_extended>on</tlbflush_extended>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </defaults>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </hyperv>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <launchSecurity supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='sectype'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tdx</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </launchSecurity>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: </domainCapabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.514 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.516 187223 DEBUG nova.virt.libvirt.volume.mount [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.520 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 25 13:44:41 np0005535656 nova_compute[187219]: <domainCapabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <path>/usr/libexec/qemu-kvm</path>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <domain>kvm</domain>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <arch>x86_64</arch>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <vcpu max='240'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <iothreads supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <os supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <enum name='firmware'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <loader supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>rom</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pflash</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='readonly'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>yes</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>no</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='secure'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>no</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </loader>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </os>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='host-passthrough' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='hostPassthroughMigratable'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>on</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>off</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='maximum' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='maximumMigratable'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>on</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>off</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='host-model' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <vendor>AMD</vendor>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='x2apic'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc-deadline'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='hypervisor'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc_adjust'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='spec-ctrl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='stibp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='cmp_legacy'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='overflow-recov'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='succor'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='amd-ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='virt-ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='lbrv'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc-scale'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='vmcb-clean'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='flushbyasid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='pause-filter'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='pfthreshold'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='svme-addr-chk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='disable' name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='custom' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Dhyana-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Genoa'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='auto-ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Genoa-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='auto-ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-128'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-256'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-512'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v6'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v7'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='KnightsMill'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512er'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512pf'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='KnightsMill-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512er'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512pf'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G4-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tbm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G5-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tbm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SierraForest'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cmpccxadd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SierraForest-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cmpccxadd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='athlon'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='athlon-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='core2duo'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='core2duo-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='coreduo'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='coreduo-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='n270'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='n270-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='phenom'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='phenom-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <memoryBacking supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <enum name='sourceType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>file</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>anonymous</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>memfd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </memoryBacking>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <devices>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <disk supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='diskDevice'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>disk</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>cdrom</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>floppy</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>lun</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='bus'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ide</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>fdc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>scsi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>sata</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-non-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <graphics supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vnc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>egl-headless</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dbus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </graphics>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <video supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='modelType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vga</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>cirrus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>none</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>bochs</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ramfb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </video>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <hostdev supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='mode'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>subsystem</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='startupPolicy'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>default</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>mandatory</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>requisite</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>optional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='subsysType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pci</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>scsi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='capsType'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='pciBackend'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </hostdev>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <rng supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-non-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>random</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>egd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>builtin</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </rng>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <filesystem supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='driverType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>path</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>handle</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtiofs</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </filesystem>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <tpm supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tpm-tis</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tpm-crb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>emulator</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>external</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendVersion'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>2.0</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </tpm>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <redirdev supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='bus'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </redirdev>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <channel supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pty</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>unix</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </channel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <crypto supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>qemu</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>builtin</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </crypto>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <interface supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>default</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>passt</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </interface>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <panic supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>isa</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>hyperv</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </panic>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <console supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>null</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pty</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dev</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>file</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pipe</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>stdio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>udp</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tcp</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>unix</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>qemu-vdagent</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dbus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </console>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </devices>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <gic supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <vmcoreinfo supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <genid supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <backingStoreInput supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <backup supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <async-teardown supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <ps2 supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <sev supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <sgx supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <hyperv supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='features'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>relaxed</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vapic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>spinlocks</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vpindex</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>runtime</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>synic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>stimer</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>reset</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vendor_id</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>frequencies</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>reenlightenment</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tlbflush</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ipi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>avic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>emsr_bitmap</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>xmm_input</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <defaults>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <spinlocks>4095</spinlocks>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <stimer_direct>on</stimer_direct>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <tlbflush_direct>on</tlbflush_direct>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <tlbflush_extended>on</tlbflush_extended>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </defaults>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </hyperv>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <launchSecurity supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='sectype'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tdx</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </launchSecurity>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: </domainCapabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.597 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 25 13:44:41 np0005535656 nova_compute[187219]: <domainCapabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <path>/usr/libexec/qemu-kvm</path>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <domain>kvm</domain>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <arch>x86_64</arch>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <vcpu max='4096'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <iothreads supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <os supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <enum name='firmware'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>efi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <loader supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>rom</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pflash</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='readonly'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>yes</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>no</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='secure'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>yes</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>no</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </loader>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </os>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='host-passthrough' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='hostPassthroughMigratable'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>on</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>off</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='maximum' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='maximumMigratable'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>on</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>off</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='host-model' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <vendor>AMD</vendor>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='x2apic'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc-deadline'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='hypervisor'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc_adjust'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='spec-ctrl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='stibp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='cmp_legacy'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='overflow-recov'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='succor'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='amd-ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='virt-ssbd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='lbrv'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='tsc-scale'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='vmcb-clean'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='flushbyasid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='pause-filter'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='pfthreshold'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='svme-addr-chk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <feature policy='disable' name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <mode name='custom' supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Broadwell-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cascadelake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Cooperlake-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Denverton-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Dhyana-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Genoa'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='auto-ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Genoa-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='auto-ibrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Milan-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amd-psfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='no-nested-data-bp'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='null-sel-clr-base'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='stibp-always-on'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-Rome-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='EPYC-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='GraniteRapids-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-128'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-256'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx10-512'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='prefetchiti'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Haswell-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-noTSX'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v6'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Icelake-Server-v7'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='IvyBridge-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='KnightsMill'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512er'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512pf'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='KnightsMill-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4fmaps'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-4vnniw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512er'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512pf'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G4-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tbm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Opteron_G5-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fma4'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tbm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xop'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SapphireRapids-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='amx-tile'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-bf16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-fp16'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512-vpopcntdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bitalg'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vbmi2'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrc'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fzrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='la57'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='taa-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='tsx-ldtrk'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xfd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SierraForest'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cmpccxadd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='SierraForest-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ifma'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-ne-convert'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx-vnni-int8'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='bus-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cmpccxadd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fbsdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='fsrs'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ibrs-all'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mcdt-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pbrsb-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='psdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='sbdr-ssdp-no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='serialize'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vaes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='vpclmulqdq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Client-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='hle'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='rtm'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Skylake-Server-v5'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512bw'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512cd'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512dq'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512f'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='avx512vl'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='invpcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pcid'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='pku'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='mpx'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v2'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v3'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='core-capability'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='split-lock-detect'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='Snowridge-v4'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='cldemote'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='erms'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='gfni'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdir64b'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='movdiri'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='xsaves'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='athlon'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='athlon-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='core2duo'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='core2duo-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='coreduo'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='coreduo-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='n270'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='n270-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='ss'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='phenom'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <blockers model='phenom-v1'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnow'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <feature name='3dnowext'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </blockers>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </mode>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <memoryBacking supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <enum name='sourceType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>file</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>anonymous</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <value>memfd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </memoryBacking>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <devices>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <disk supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='diskDevice'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>disk</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>cdrom</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>floppy</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>lun</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='bus'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>fdc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>scsi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>sata</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-non-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <graphics supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vnc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>egl-headless</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dbus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </graphics>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <video supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='modelType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vga</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>cirrus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>none</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>bochs</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ramfb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </video>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <hostdev supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='mode'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>subsystem</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='startupPolicy'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>default</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>mandatory</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>requisite</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>optional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='subsysType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pci</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>scsi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='capsType'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='pciBackend'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </hostdev>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <rng supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtio-non-transitional</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>random</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>egd</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>builtin</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </rng>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <filesystem supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='driverType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>path</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>handle</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>virtiofs</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </filesystem>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <tpm supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tpm-tis</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tpm-crb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>emulator</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>external</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendVersion'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>2.0</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </tpm>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <redirdev supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='bus'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>usb</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </redirdev>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <channel supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pty</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>unix</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </channel>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <crypto supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>qemu</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendModel'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>builtin</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </crypto>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <interface supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='backendType'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>default</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>passt</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </interface>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <panic supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='model'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>isa</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>hyperv</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </panic>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <console supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='type'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>null</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vc</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pty</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dev</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>file</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>pipe</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>stdio</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>udp</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tcp</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>unix</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>qemu-vdagent</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>dbus</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </console>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </devices>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <gic supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <vmcoreinfo supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <genid supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <backingStoreInput supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <backup supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <async-teardown supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <ps2 supported='yes'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <sev supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <sgx supported='no'/>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <hyperv supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='features'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>relaxed</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vapic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>spinlocks</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vpindex</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>runtime</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>synic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>stimer</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>reset</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>vendor_id</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>frequencies</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>reenlightenment</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tlbflush</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>ipi</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>avic</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>emsr_bitmap</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>xmm_input</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <defaults>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <spinlocks>4095</spinlocks>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <stimer_direct>on</stimer_direct>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <tlbflush_direct>on</tlbflush_direct>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <tlbflush_extended>on</tlbflush_extended>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </defaults>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </hyperv>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    <launchSecurity supported='yes'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      <enum name='sectype'>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:        <value>tdx</value>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:      </enum>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:    </launchSecurity>
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  </features>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: </domainCapabilities>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.679 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.680 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.680 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.680 187223 INFO nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Secure Boot support detected#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.683 187223 INFO nova.virt.libvirt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.683 187223 INFO nova.virt.libvirt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.697 187223 DEBUG nova.virt.libvirt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 25 13:44:41 np0005535656 nova_compute[187219]:  <model>Nehalem</model>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: </cpu>
Nov 25 13:44:41 np0005535656 nova_compute[187219]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.701 187223 DEBUG nova.virt.libvirt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.751 187223 INFO nova.virt.node [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Determined node identity 752b63a7-2ce2-4d83-a281-12c9803714ea from /var/lib/nova/compute_id#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.847 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Verified node 752b63a7-2ce2-4d83-a281-12c9803714ea matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.892 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.994 187223 DEBUG oslo_concurrency.lockutils [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.995 187223 DEBUG oslo_concurrency.lockutils [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.995 187223 DEBUG oslo_concurrency.lockutils [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:44:41 np0005535656 nova_compute[187219]: 2025-11-25 18:44:41.995 187223 DEBUG nova.compute.resource_tracker [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.173 187223 WARNING nova.virt.libvirt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.174 187223 DEBUG nova.compute.resource_tracker [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6131MB free_disk=73.36636734008789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.174 187223 DEBUG oslo_concurrency.lockutils [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.174 187223 DEBUG oslo_concurrency.lockutils [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.331 187223 DEBUG nova.compute.resource_tracker [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.331 187223 DEBUG nova.compute.resource_tracker [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.400 187223 DEBUG nova.scheduler.client.report [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Refreshing inventories for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.438 187223 DEBUG nova.scheduler.client.report [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Updating ProviderTree inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea from _refresh_and_get_inventory using data: {} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.438 187223 DEBUG nova.compute.provider_tree [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.465 187223 DEBUG nova.scheduler.client.report [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Refreshing aggregate associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.478 187223 DEBUG nova.scheduler.client.report [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Refreshing trait associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, traits: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.514 187223 DEBUG nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 25 13:44:42 np0005535656 nova_compute[187219]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.514 187223 INFO nova.virt.libvirt.host [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.515 187223 DEBUG nova.compute.provider_tree [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.516 187223 DEBUG nova.virt.libvirt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.519 187223 DEBUG nova.virt.libvirt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Libvirt baseline CPU <cpu>
Nov 25 13:44:42 np0005535656 nova_compute[187219]:  <arch>x86_64</arch>
Nov 25 13:44:42 np0005535656 nova_compute[187219]:  <model>Nehalem</model>
Nov 25 13:44:42 np0005535656 nova_compute[187219]:  <vendor>AMD</vendor>
Nov 25 13:44:42 np0005535656 nova_compute[187219]:  <topology sockets="8" cores="1" threads="1"/>
Nov 25 13:44:42 np0005535656 nova_compute[187219]: </cpu>
Nov 25 13:44:42 np0005535656 nova_compute[187219]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.580 187223 DEBUG nova.scheduler.client.report [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Updated inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.581 187223 DEBUG nova.compute.provider_tree [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.581 187223 DEBUG nova.compute.provider_tree [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.678 187223 DEBUG nova.compute.provider_tree [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.710 187223 DEBUG nova.compute.resource_tracker [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.711 187223 DEBUG oslo_concurrency.lockutils [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.711 187223 DEBUG nova.service [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.785 187223 DEBUG nova.service [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 25 13:44:42 np0005535656 nova_compute[187219]: 2025-11-25 18:44:42.786 187223 DEBUG nova.servicegroup.drivers.db [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 25 13:44:44 np0005535656 podman[187519]: 2025-11-25 18:44:44.016905337 +0000 UTC m=+0.116653418 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:44:45 np0005535656 systemd-logind[788]: New session 27 of user zuul.
Nov 25 13:44:45 np0005535656 systemd[1]: Started Session 27 of User zuul.
Nov 25 13:44:47 np0005535656 python3.9[187694]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 13:44:48 np0005535656 python3.9[187850]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:44:49 np0005535656 systemd[1]: Reloading.
Nov 25 13:44:49 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:44:49 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:44:50 np0005535656 python3.9[188035]: ansible-ansible.builtin.service_facts Invoked
Nov 25 13:44:50 np0005535656 network[188052]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 13:44:50 np0005535656 network[188053]: 'network-scripts' will be removed from distribution in near future.
Nov 25 13:44:50 np0005535656 network[188054]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 13:44:55 np0005535656 nova_compute[187219]: 2025-11-25 18:44:55.788 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:44:55 np0005535656 nova_compute[187219]: 2025-11-25 18:44:55.810 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:44:56 np0005535656 python3.9[188328]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:44:58 np0005535656 python3.9[188481]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:44:58 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 13:44:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:44:59.053 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:44:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:44:59.055 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:44:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:44:59.055 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:44:59 np0005535656 python3.9[188634]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:00 np0005535656 python3.9[188786]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:45:01 np0005535656 python3.9[188938]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 13:45:02 np0005535656 python3.9[189090]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:45:02 np0005535656 systemd[1]: Reloading.
Nov 25 13:45:02 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:45:02 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:45:03 np0005535656 podman[189248]: 2025-11-25 18:45:03.682807295 +0000 UTC m=+0.140107841 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 25 13:45:03 np0005535656 python3.9[189289]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:45:04 np0005535656 podman[189427]: 2025-11-25 18:45:04.547093123 +0000 UTC m=+0.069492505 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 25 13:45:04 np0005535656 python3.9[189475]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:45:06 np0005535656 python3.9[189625]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:45:06 np0005535656 python3.9[189777]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:07 np0005535656 python3.9[189898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096306.4593067-252-113031104211605/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:45:08 np0005535656 python3.9[190050]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 25 13:45:10 np0005535656 python3.9[190202]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 25 13:45:10 np0005535656 python3.9[190355]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 13:45:12 np0005535656 python3.9[190513]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 13:45:13 np0005535656 python3.9[190671]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:14 np0005535656 podman[190766]: 2025-11-25 18:45:14.319982824 +0000 UTC m=+0.063405042 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd)
Nov 25 13:45:14 np0005535656 python3.9[190809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764096313.2816038-388-210439296788523/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:15 np0005535656 python3.9[190962]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:15 np0005535656 python3.9[191083]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764096314.679874-388-182307803403642/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:16 np0005535656 python3.9[191233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:17 np0005535656 python3.9[191354]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764096316.0074089-388-14862546702073/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:18 np0005535656 python3.9[191504]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:45:19 np0005535656 python3.9[191656]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:45:20 np0005535656 python3.9[191808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:21 np0005535656 python3.9[191929]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096320.1632555-506-215792567678899/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:22 np0005535656 python3.9[192079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:22 np0005535656 python3.9[192155]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:23 np0005535656 python3.9[192305]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:24 np0005535656 python3.9[192426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096323.0217211-506-108266585314911/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:24 np0005535656 python3.9[192576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:25 np0005535656 python3.9[192697]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096324.3753176-506-121940049782213/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:26 np0005535656 python3.9[192847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:27 np0005535656 python3.9[192968]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096325.8155448-506-97047628412949/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:27 np0005535656 python3.9[193120]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:28 np0005535656 python3.9[193241]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096327.2905042-506-125414292409641/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:29 np0005535656 python3.9[193391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:29 np0005535656 python3.9[193512]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096328.635015-506-83995982488705/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:30 np0005535656 python3.9[193662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:31 np0005535656 python3.9[193783]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096330.0838423-506-215199220283858/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:32 np0005535656 python3.9[193933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:32 np0005535656 python3.9[194054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096331.4930625-506-56046850876001/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:33 np0005535656 python3.9[194204]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:33 np0005535656 podman[194299]: 2025-11-25 18:45:33.918568269 +0000 UTC m=+0.107283576 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 13:45:34 np0005535656 python3.9[194340]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096332.8731067-506-229095317400261/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:34 np0005535656 python3.9[194502]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:34 np0005535656 podman[194538]: 2025-11-25 18:45:34.979277696 +0000 UTC m=+0.096856204 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 13:45:35 np0005535656 python3.9[194642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096334.239746-506-86374727302851/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:36 np0005535656 python3.9[194792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:36 np0005535656 python3.9[194868]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:37 np0005535656 python3.9[195018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:38 np0005535656 python3.9[195094]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:39 np0005535656 python3.9[195244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:39 np0005535656 python3.9[195320]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:40 np0005535656 python3.9[195472]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.674 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.675 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.675 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.675 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.690 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.691 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.691 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.692 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.692 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.693 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.693 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.693 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.694 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.718 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.719 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.719 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.720 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.977 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.979 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6170MB free_disk=73.3662338256836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.979 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:45:40 np0005535656 nova_compute[187219]: 2025-11-25 18:45:40.979 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:45:41 np0005535656 nova_compute[187219]: 2025-11-25 18:45:41.065 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:45:41 np0005535656 nova_compute[187219]: 2025-11-25 18:45:41.066 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:45:41 np0005535656 nova_compute[187219]: 2025-11-25 18:45:41.092 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:45:41 np0005535656 nova_compute[187219]: 2025-11-25 18:45:41.117 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:45:41 np0005535656 nova_compute[187219]: 2025-11-25 18:45:41.119 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:45:41 np0005535656 nova_compute[187219]: 2025-11-25 18:45:41.120 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:45:41 np0005535656 python3.9[195624]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:42 np0005535656 python3.9[195776]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:45:43 np0005535656 python3.9[195928]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:45:43 np0005535656 systemd[1]: Reloading.
Nov 25 13:45:43 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:45:43 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:45:43 np0005535656 systemd[1]: Listening on Podman API Socket.
Nov 25 13:45:44 np0005535656 python3.9[196119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:45 np0005535656 podman[196210]: 2025-11-25 18:45:45.017327602 +0000 UTC m=+0.133616616 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 13:45:45 np0005535656 python3.9[196262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096343.9214997-950-195480929041903/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:45:46 np0005535656 python3.9[196415]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 25 13:45:47 np0005535656 python3.9[196567]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 13:45:48 np0005535656 python3[196719]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 13:45:50 np0005535656 podman[196733]: 2025-11-25 18:45:50.009900801 +0000 UTC m=+1.313187110 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 25 13:45:50 np0005535656 podman[196831]: 2025-11-25 18:45:50.251221802 +0000 UTC m=+0.062665182 container create 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Nov 25 13:45:50 np0005535656 podman[196831]: 2025-11-25 18:45:50.219967128 +0000 UTC m=+0.031410538 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 25 13:45:50 np0005535656 python3[196719]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 25 13:45:51 np0005535656 python3.9[197020]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:45:52 np0005535656 python3.9[197174]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:53 np0005535656 python3.9[197325]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764096352.4738317-1056-17496014778091/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:45:54 np0005535656 python3.9[197401]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:45:54 np0005535656 systemd[1]: Reloading.
Nov 25 13:45:54 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:45:54 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:45:55 np0005535656 python3.9[197513]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:45:55 np0005535656 systemd[1]: Reloading.
Nov 25 13:45:55 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:45:55 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:45:55 np0005535656 systemd[1]: Starting podman_exporter container...
Nov 25 13:45:55 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:45:55 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7c8473bd77c330a646f784070da9ad42c5acd45bb6b16aa4b43712f880388ea/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 13:45:55 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7c8473bd77c330a646f784070da9ad42c5acd45bb6b16aa4b43712f880388ea/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 13:45:55 np0005535656 systemd[1]: Started /usr/bin/podman healthcheck run 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763.
Nov 25 13:45:55 np0005535656 podman[197553]: 2025-11-25 18:45:55.917527268 +0000 UTC m=+0.170675656 container init 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 13:45:55 np0005535656 podman_exporter[197567]: ts=2025-11-25T18:45:55.941Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 25 13:45:55 np0005535656 podman_exporter[197567]: ts=2025-11-25T18:45:55.941Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 25 13:45:55 np0005535656 podman_exporter[197567]: ts=2025-11-25T18:45:55.941Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 25 13:45:55 np0005535656 podman_exporter[197567]: ts=2025-11-25T18:45:55.941Z caller=handler.go:105 level=info collector=container
Nov 25 13:45:55 np0005535656 podman[197553]: 2025-11-25 18:45:55.946279573 +0000 UTC m=+0.199427931 container start 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:45:55 np0005535656 podman[197553]: podman_exporter
Nov 25 13:45:55 np0005535656 systemd[1]: Starting Podman API Service...
Nov 25 13:45:55 np0005535656 systemd[1]: Started Podman API Service.
Nov 25 13:45:55 np0005535656 systemd[1]: Started podman_exporter container.
Nov 25 13:45:55 np0005535656 podman[197580]: time="2025-11-25T18:45:55Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 25 13:45:55 np0005535656 podman[197580]: time="2025-11-25T18:45:55Z" level=info msg="Setting parallel job count to 25"
Nov 25 13:45:55 np0005535656 podman[197580]: time="2025-11-25T18:45:55Z" level=info msg="Using sqlite as database backend"
Nov 25 13:45:56 np0005535656 podman[197580]: time="2025-11-25T18:45:56Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 25 13:45:56 np0005535656 podman[197580]: time="2025-11-25T18:45:56Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 25 13:45:56 np0005535656 podman[197580]: time="2025-11-25T18:45:56Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 25 13:45:56 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:45:56 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 25 13:45:56 np0005535656 podman[197580]: time="2025-11-25T18:45:56Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:45:56 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:45:56 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14065 "" "Go-http-client/1.1"
Nov 25 13:45:56 np0005535656 podman_exporter[197567]: ts=2025-11-25T18:45:56.047Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 25 13:45:56 np0005535656 podman_exporter[197567]: ts=2025-11-25T18:45:56.047Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 25 13:45:56 np0005535656 podman_exporter[197567]: ts=2025-11-25T18:45:56.048Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 25 13:45:56 np0005535656 podman[197578]: 2025-11-25 18:45:56.052467848 +0000 UTC m=+0.084247844 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:45:56 np0005535656 systemd[1]: 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763-83cfa305291569b.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 13:45:56 np0005535656 systemd[1]: 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763-83cfa305291569b.service: Failed with result 'exit-code'.
Nov 25 13:45:57 np0005535656 python3.9[197767]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:45:57 np0005535656 systemd[1]: Stopping podman_exporter container...
Nov 25 13:45:57 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:45:56 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Nov 25 13:45:57 np0005535656 systemd[1]: libpod-7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763.scope: Deactivated successfully.
Nov 25 13:45:57 np0005535656 podman[197771]: 2025-11-25 18:45:57.264307974 +0000 UTC m=+0.081822189 container died 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:45:57 np0005535656 systemd[1]: 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763-83cfa305291569b.timer: Deactivated successfully.
Nov 25 13:45:57 np0005535656 systemd[1]: Stopped /usr/bin/podman healthcheck run 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763.
Nov 25 13:45:57 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763-userdata-shm.mount: Deactivated successfully.
Nov 25 13:45:57 np0005535656 systemd[1]: var-lib-containers-storage-overlay-a7c8473bd77c330a646f784070da9ad42c5acd45bb6b16aa4b43712f880388ea-merged.mount: Deactivated successfully.
Nov 25 13:45:57 np0005535656 podman[197771]: 2025-11-25 18:45:57.495395348 +0000 UTC m=+0.312909573 container cleanup 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 13:45:57 np0005535656 podman[197771]: podman_exporter
Nov 25 13:45:57 np0005535656 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 25 13:45:57 np0005535656 podman[197802]: podman_exporter
Nov 25 13:45:57 np0005535656 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 25 13:45:57 np0005535656 systemd[1]: Stopped podman_exporter container.
Nov 25 13:45:57 np0005535656 systemd[1]: Starting podman_exporter container...
Nov 25 13:45:57 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:45:57 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7c8473bd77c330a646f784070da9ad42c5acd45bb6b16aa4b43712f880388ea/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 13:45:57 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7c8473bd77c330a646f784070da9ad42c5acd45bb6b16aa4b43712f880388ea/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 13:45:57 np0005535656 systemd[1]: Started /usr/bin/podman healthcheck run 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763.
Nov 25 13:45:57 np0005535656 podman[197815]: 2025-11-25 18:45:57.785002011 +0000 UTC m=+0.168207489 container init 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:45:57 np0005535656 podman_exporter[197830]: ts=2025-11-25T18:45:57.809Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 25 13:45:57 np0005535656 podman_exporter[197830]: ts=2025-11-25T18:45:57.809Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 25 13:45:57 np0005535656 podman_exporter[197830]: ts=2025-11-25T18:45:57.809Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 25 13:45:57 np0005535656 podman_exporter[197830]: ts=2025-11-25T18:45:57.809Z caller=handler.go:105 level=info collector=container
Nov 25 13:45:57 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:45:57 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 25 13:45:57 np0005535656 podman[197580]: time="2025-11-25T18:45:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:45:57 np0005535656 podman[197815]: 2025-11-25 18:45:57.828139065 +0000 UTC m=+0.211344493 container start 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:45:57 np0005535656 podman[197815]: podman_exporter
Nov 25 13:45:57 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:45:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14067 "" "Go-http-client/1.1"
Nov 25 13:45:57 np0005535656 podman_exporter[197830]: ts=2025-11-25T18:45:57.836Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 25 13:45:57 np0005535656 podman_exporter[197830]: ts=2025-11-25T18:45:57.836Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 25 13:45:57 np0005535656 podman_exporter[197830]: ts=2025-11-25T18:45:57.837Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 25 13:45:57 np0005535656 systemd[1]: Started podman_exporter container.
Nov 25 13:45:57 np0005535656 podman[197840]: 2025-11-25 18:45:57.949705235 +0000 UTC m=+0.103673498 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 13:45:58 np0005535656 python3.9[198017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:45:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:45:59.056 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:45:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:45:59.057 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:45:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:45:59.057 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:45:59 np0005535656 python3.9[198140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096358.1730115-1120-237536327727057/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 13:46:00 np0005535656 python3.9[198292]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 25 13:46:01 np0005535656 python3.9[198444]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 13:46:02 np0005535656 python3[198598]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 13:46:05 np0005535656 podman[198653]: 2025-11-25 18:46:05.061824658 +0000 UTC m=+1.066538804 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:46:05 np0005535656 podman[198695]: 2025-11-25 18:46:05.343463871 +0000 UTC m=+0.270124874 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:46:05 np0005535656 podman[198610]: 2025-11-25 18:46:05.347314519 +0000 UTC m=+2.714128229 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 25 13:46:05 np0005535656 podman[198752]: 2025-11-25 18:46:05.510967375 +0000 UTC m=+0.062342976 container create 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 13:46:05 np0005535656 podman[198752]: 2025-11-25 18:46:05.480251831 +0000 UTC m=+0.031627492 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 25 13:46:05 np0005535656 python3[198598]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z 
quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 25 13:46:06 np0005535656 python3.9[198942]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:46:07 np0005535656 python3.9[199096]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:08 np0005535656 python3.9[199247]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764096367.5917108-1226-75603870246146/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:08 np0005535656 python3.9[199323]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 13:46:08 np0005535656 systemd[1]: Reloading.
Nov 25 13:46:09 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:46:09 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:46:09 np0005535656 python3.9[199435]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 13:46:10 np0005535656 systemd[1]: Reloading.
Nov 25 13:46:10 np0005535656 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 13:46:10 np0005535656 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 13:46:10 np0005535656 systemd[1]: Starting openstack_network_exporter container...
Nov 25 13:46:10 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:46:10 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcfab3c50eae8036001473a768be90de91342b0ae6ceed3889a16e79a29b9029/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 25 13:46:10 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcfab3c50eae8036001473a768be90de91342b0ae6ceed3889a16e79a29b9029/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 13:46:10 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcfab3c50eae8036001473a768be90de91342b0ae6ceed3889a16e79a29b9029/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 13:46:10 np0005535656 systemd[1]: Started /usr/bin/podman healthcheck run 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7.
Nov 25 13:46:10 np0005535656 podman[199475]: 2025-11-25 18:46:10.578075033 +0000 UTC m=+0.170128849 container init 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *bridge.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *coverage.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *datapath.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *iface.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *memory.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *ovnnorthd.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *ovn.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *ovsdbserver.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *pmd_perf.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *pmd_rxq.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: INFO    18:46:10 main.go:48: registering *vswitch.Collector
Nov 25 13:46:10 np0005535656 openstack_network_exporter[199492]: NOTICE  18:46:10 main.go:76: listening on https://:9105/metrics
Nov 25 13:46:10 np0005535656 podman[199475]: 2025-11-25 18:46:10.61494003 +0000 UTC m=+0.206993826 container start 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Nov 25 13:46:10 np0005535656 podman[199475]: openstack_network_exporter
Nov 25 13:46:10 np0005535656 systemd[1]: Started openstack_network_exporter container.
Nov 25 13:46:10 np0005535656 podman[199502]: 2025-11-25 18:46:10.722446554 +0000 UTC m=+0.091323615 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Nov 25 13:46:11 np0005535656 python3.9[199677]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 13:46:11 np0005535656 systemd[1]: Stopping openstack_network_exporter container...
Nov 25 13:46:11 np0005535656 systemd[1]: libpod-259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7.scope: Deactivated successfully.
Nov 25 13:46:11 np0005535656 podman[199681]: 2025-11-25 18:46:11.790769687 +0000 UTC m=+0.051762783 container died 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Nov 25 13:46:11 np0005535656 systemd[1]: 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7-2ead551aaa2d9fd0.timer: Deactivated successfully.
Nov 25 13:46:11 np0005535656 systemd[1]: Stopped /usr/bin/podman healthcheck run 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7.
Nov 25 13:46:11 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7-userdata-shm.mount: Deactivated successfully.
Nov 25 13:46:11 np0005535656 systemd[1]: var-lib-containers-storage-overlay-fcfab3c50eae8036001473a768be90de91342b0ae6ceed3889a16e79a29b9029-merged.mount: Deactivated successfully.
Nov 25 13:46:12 np0005535656 podman[199681]: 2025-11-25 18:46:12.582878896 +0000 UTC m=+0.843871982 container cleanup 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public)
Nov 25 13:46:12 np0005535656 podman[199681]: openstack_network_exporter
Nov 25 13:46:12 np0005535656 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 25 13:46:12 np0005535656 podman[199710]: openstack_network_exporter
Nov 25 13:46:12 np0005535656 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 25 13:46:12 np0005535656 systemd[1]: Stopped openstack_network_exporter container.
Nov 25 13:46:12 np0005535656 systemd[1]: Starting openstack_network_exporter container...
Nov 25 13:46:12 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:46:12 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcfab3c50eae8036001473a768be90de91342b0ae6ceed3889a16e79a29b9029/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 25 13:46:12 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcfab3c50eae8036001473a768be90de91342b0ae6ceed3889a16e79a29b9029/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 13:46:12 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcfab3c50eae8036001473a768be90de91342b0ae6ceed3889a16e79a29b9029/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 13:46:12 np0005535656 systemd[1]: Started /usr/bin/podman healthcheck run 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7.
Nov 25 13:46:12 np0005535656 podman[199723]: 2025-11-25 18:46:12.8361521 +0000 UTC m=+0.137342946 container init 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *bridge.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *coverage.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *datapath.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *iface.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *memory.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *ovnnorthd.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *ovn.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *ovsdbserver.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *pmd_perf.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *pmd_rxq.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: INFO    18:46:12 main.go:48: registering *vswitch.Collector
Nov 25 13:46:12 np0005535656 openstack_network_exporter[199738]: NOTICE  18:46:12 main.go:76: listening on https://:9105/metrics
Nov 25 13:46:12 np0005535656 podman[199723]: 2025-11-25 18:46:12.867582565 +0000 UTC m=+0.168773361 container start 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible)
Nov 25 13:46:12 np0005535656 podman[199723]: openstack_network_exporter
Nov 25 13:46:12 np0005535656 systemd[1]: Started openstack_network_exporter container.
Nov 25 13:46:12 np0005535656 podman[199748]: 2025-11-25 18:46:12.994698235 +0000 UTC m=+0.101636601 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, version=9.6, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 13:46:13 np0005535656 python3.9[199920]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 13:46:14 np0005535656 python3.9[200072]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 25 13:46:15 np0005535656 podman[200209]: 2025-11-25 18:46:15.941080061 +0000 UTC m=+0.117993798 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 25 13:46:16 np0005535656 python3.9[200251]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:16 np0005535656 systemd[1]: Started libpod-conmon-b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff.scope.
Nov 25 13:46:16 np0005535656 podman[200255]: 2025-11-25 18:46:16.191778333 +0000 UTC m=+0.103108403 container exec b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 25 13:46:16 np0005535656 podman[200255]: 2025-11-25 18:46:16.201237137 +0000 UTC m=+0.112567207 container exec_died b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Nov 25 13:46:16 np0005535656 systemd[1]: libpod-conmon-b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff.scope: Deactivated successfully.
Nov 25 13:46:17 np0005535656 python3.9[200439]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:17 np0005535656 systemd[1]: Started libpod-conmon-b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff.scope.
Nov 25 13:46:17 np0005535656 podman[200440]: 2025-11-25 18:46:17.225776239 +0000 UTC m=+0.094045490 container exec b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 25 13:46:17 np0005535656 podman[200440]: 2025-11-25 18:46:17.261996178 +0000 UTC m=+0.130265409 container exec_died b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 13:46:17 np0005535656 systemd[1]: libpod-conmon-b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff.scope: Deactivated successfully.
Nov 25 13:46:18 np0005535656 python3.9[200624]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:19 np0005535656 python3.9[200776]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 25 13:46:20 np0005535656 python3.9[200941]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:20 np0005535656 systemd[1]: Started libpod-conmon-e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f.scope.
Nov 25 13:46:20 np0005535656 podman[200942]: 2025-11-25 18:46:20.459840917 +0000 UTC m=+0.107140445 container exec e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 13:46:20 np0005535656 podman[200942]: 2025-11-25 18:46:20.49804535 +0000 UTC m=+0.145344798 container exec_died e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:46:20 np0005535656 systemd[1]: libpod-conmon-e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f.scope: Deactivated successfully.
Nov 25 13:46:21 np0005535656 python3.9[201125]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:21 np0005535656 systemd[1]: Started libpod-conmon-e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f.scope.
Nov 25 13:46:21 np0005535656 podman[201126]: 2025-11-25 18:46:21.514711595 +0000 UTC m=+0.077104839 container exec e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 25 13:46:21 np0005535656 podman[201126]: 2025-11-25 18:46:21.548490695 +0000 UTC m=+0.110883949 container exec_died e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 25 13:46:21 np0005535656 systemd[1]: libpod-conmon-e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f.scope: Deactivated successfully.
Nov 25 13:46:21 np0005535656 auditd[704]: Audit daemon rotating log files
Nov 25 13:46:22 np0005535656 python3.9[201309]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:23 np0005535656 python3.9[201461]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 25 13:46:24 np0005535656 python3.9[201626]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:24 np0005535656 systemd[1]: Started libpod-conmon-1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60.scope.
Nov 25 13:46:24 np0005535656 podman[201627]: 2025-11-25 18:46:24.163851682 +0000 UTC m=+0.078012694 container exec 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:46:24 np0005535656 podman[201627]: 2025-11-25 18:46:24.200911964 +0000 UTC m=+0.115072936 container exec_died 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 25 13:46:24 np0005535656 systemd[1]: libpod-conmon-1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60.scope: Deactivated successfully.
Nov 25 13:46:24 np0005535656 python3.9[201811]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:25 np0005535656 systemd[1]: Started libpod-conmon-1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60.scope.
Nov 25 13:46:25 np0005535656 podman[201812]: 2025-11-25 18:46:25.040555408 +0000 UTC m=+0.070006661 container exec 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 13:46:25 np0005535656 podman[201812]: 2025-11-25 18:46:25.07082786 +0000 UTC m=+0.100279113 container exec_died 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:46:25 np0005535656 systemd[1]: libpod-conmon-1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60.scope: Deactivated successfully.
Nov 25 13:46:25 np0005535656 python3.9[201995]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:26 np0005535656 python3.9[202147]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 25 13:46:27 np0005535656 python3.9[202312]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:27 np0005535656 systemd[1]: Started libpod-conmon-7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763.scope.
Nov 25 13:46:27 np0005535656 podman[202313]: 2025-11-25 18:46:27.679026018 +0000 UTC m=+0.074225858 container exec 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:46:27 np0005535656 podman[202313]: 2025-11-25 18:46:27.708936961 +0000 UTC m=+0.104136781 container exec_died 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:46:27 np0005535656 systemd[1]: libpod-conmon-7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763.scope: Deactivated successfully.
Nov 25 13:46:28 np0005535656 podman[202469]: 2025-11-25 18:46:28.297265616 +0000 UTC m=+0.069920148 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:46:28 np0005535656 python3.9[202519]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:28 np0005535656 systemd[1]: Started libpod-conmon-7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763.scope.
Nov 25 13:46:28 np0005535656 podman[202523]: 2025-11-25 18:46:28.589057842 +0000 UTC m=+0.088692501 container exec 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:46:28 np0005535656 podman[202523]: 2025-11-25 18:46:28.62524204 +0000 UTC m=+0.124876709 container exec_died 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:46:28 np0005535656 systemd[1]: libpod-conmon-7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763.scope: Deactivated successfully.
Nov 25 13:46:29 np0005535656 python3.9[202707]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:30 np0005535656 python3.9[202859]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 25 13:46:31 np0005535656 python3.9[203024]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:31 np0005535656 systemd[1]: Started libpod-conmon-259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7.scope.
Nov 25 13:46:31 np0005535656 podman[203025]: 2025-11-25 18:46:31.322685692 +0000 UTC m=+0.076763968 container exec 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, vendor=Red Hat, Inc., version=9.6, architecture=x86_64)
Nov 25 13:46:31 np0005535656 podman[203025]: 2025-11-25 18:46:31.356849154 +0000 UTC m=+0.110927430 container exec_died 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 13:46:31 np0005535656 systemd[1]: libpod-conmon-259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7.scope: Deactivated successfully.
Nov 25 13:46:32 np0005535656 python3.9[203208]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 13:46:32 np0005535656 systemd[1]: Started libpod-conmon-259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7.scope.
Nov 25 13:46:32 np0005535656 podman[203209]: 2025-11-25 18:46:32.208694177 +0000 UTC m=+0.068955691 container exec 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 13:46:32 np0005535656 podman[203209]: 2025-11-25 18:46:32.244832604 +0000 UTC m=+0.105094098 container exec_died 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git)
Nov 25 13:46:32 np0005535656 systemd[1]: libpod-conmon-259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7.scope: Deactivated successfully.
Nov 25 13:46:33 np0005535656 python3.9[203393]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:35 np0005535656 podman[203419]: 2025-11-25 18:46:35.994958394 +0000 UTC m=+0.098543985 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 25 13:46:36 np0005535656 podman[203418]: 2025-11-25 18:46:36.017654666 +0000 UTC m=+0.131038050 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.112 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.113 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.138 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.139 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.139 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.139 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.693 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.693 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.693 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.694 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.720 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.720 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.721 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.721 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.909 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.910 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6068MB free_disk=73.19872283935547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.910 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.910 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.991 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:46:41 np0005535656 nova_compute[187219]: 2025-11-25 18:46:41.992 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:46:42 np0005535656 nova_compute[187219]: 2025-11-25 18:46:42.023 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:46:42 np0005535656 nova_compute[187219]: 2025-11-25 18:46:42.045 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:46:42 np0005535656 nova_compute[187219]: 2025-11-25 18:46:42.048 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:46:42 np0005535656 nova_compute[187219]: 2025-11-25 18:46:42.048 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:46:43 np0005535656 nova_compute[187219]: 2025-11-25 18:46:43.027 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:46:43 np0005535656 podman[203463]: 2025-11-25 18:46:43.991366462 +0000 UTC m=+0.113046430 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Nov 25 13:46:46 np0005535656 podman[203485]: 2025-11-25 18:46:46.965940452 +0000 UTC m=+0.087388494 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:46:54 np0005535656 python3.9[203635]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:54 np0005535656 python3.9[203787]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:46:55 np0005535656 python3.9[203910]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096414.3537855-1656-183607973574862/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:56 np0005535656 python3.9[204062]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:57 np0005535656 python3.9[204214]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:46:57 np0005535656 python3.9[204292]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:46:58 np0005535656 podman[204416]: 2025-11-25 18:46:58.582002345 +0000 UTC m=+0.098846773 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:46:58 np0005535656 python3.9[204465]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:46:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:46:59.055 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:46:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:46:59.056 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:46:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:46:59.056 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:46:59 np0005535656 python3.9[204545]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zc2tagez recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:00 np0005535656 python3.9[204697]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:47:00 np0005535656 python3.9[204775]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:01 np0005535656 python3.9[204927]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:47:02 np0005535656 python3[205080]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 13:47:03 np0005535656 python3.9[205232]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:47:04 np0005535656 python3.9[205310]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:05 np0005535656 python3.9[205462]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:47:05 np0005535656 python3.9[205540]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:06 np0005535656 podman[205665]: 2025-11-25 18:47:06.536935264 +0000 UTC m=+0.071918873 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Nov 25 13:47:06 np0005535656 podman[205664]: 2025-11-25 18:47:06.60464663 +0000 UTC m=+0.139073224 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 13:47:06 np0005535656 python3.9[205726]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:47:07 np0005535656 python3.9[205809]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:08 np0005535656 python3.9[205961]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:47:08 np0005535656 python3.9[206039]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:09 np0005535656 python3.9[206191]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 13:47:10 np0005535656 python3.9[206316]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096428.8029263-1906-17083151022703/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:11 np0005535656 python3.9[206468]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:11 np0005535656 python3.9[206620]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:47:13 np0005535656 python3.9[206775]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:14 np0005535656 python3.9[206927]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:47:14 np0005535656 podman[207052]: 2025-11-25 18:47:14.823174221 +0000 UTC m=+0.079222908 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, name=ubi9-minimal, config_id=edpm, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 13:47:15 np0005535656 python3.9[207099]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 13:47:15 np0005535656 python3.9[207253]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 13:47:16 np0005535656 python3.9[207408]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 13:47:17 np0005535656 systemd[1]: session-27.scope: Deactivated successfully.
Nov 25 13:47:17 np0005535656 systemd[1]: session-27.scope: Consumed 1min 37.021s CPU time.
Nov 25 13:47:17 np0005535656 systemd-logind[788]: Session 27 logged out. Waiting for processes to exit.
Nov 25 13:47:17 np0005535656 systemd-logind[788]: Removed session 27.
Nov 25 13:47:17 np0005535656 podman[207433]: 2025-11-25 18:47:17.518921512 +0000 UTC m=+0.094089971 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 13:47:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:47:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:47:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:47:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:47:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:47:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:47:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:47:28 np0005535656 podman[207460]: 2025-11-25 18:47:28.978080491 +0000 UTC m=+0.082660323 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 13:47:35 np0005535656 podman[197580]: time="2025-11-25T18:47:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:47:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:47:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:47:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:47:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2566 "" "Go-http-client/1.1"
Nov 25 13:47:36 np0005535656 podman[207487]: 2025-11-25 18:47:36.97861722 +0000 UTC m=+0.089235896 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 13:47:37 np0005535656 podman[207486]: 2025-11-25 18:47:37.035314289 +0000 UTC m=+0.145886554 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:47:41 np0005535656 nova_compute[187219]: 2025-11-25 18:47:41.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:47:41 np0005535656 nova_compute[187219]: 2025-11-25 18:47:41.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:47:41 np0005535656 nova_compute[187219]: 2025-11-25 18:47:41.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:47:41 np0005535656 nova_compute[187219]: 2025-11-25 18:47:41.697 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:47:42 np0005535656 nova_compute[187219]: 2025-11-25 18:47:42.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:47:42 np0005535656 nova_compute[187219]: 2025-11-25 18:47:42.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:47:42 np0005535656 nova_compute[187219]: 2025-11-25 18:47:42.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:47:42 np0005535656 nova_compute[187219]: 2025-11-25 18:47:42.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:47:42 np0005535656 nova_compute[187219]: 2025-11-25 18:47:42.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:47:42 np0005535656 nova_compute[187219]: 2025-11-25 18:47:42.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:47:42 np0005535656 nova_compute[187219]: 2025-11-25 18:47:42.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:47:43 np0005535656 nova_compute[187219]: 2025-11-25 18:47:43.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:47:43 np0005535656 nova_compute[187219]: 2025-11-25 18:47:43.852 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:47:43 np0005535656 nova_compute[187219]: 2025-11-25 18:47:43.852 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:47:43 np0005535656 nova_compute[187219]: 2025-11-25 18:47:43.853 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:47:43 np0005535656 nova_compute[187219]: 2025-11-25 18:47:43.853 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.031 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.032 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6117MB free_disk=73.19845581054688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.032 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.033 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.203 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.203 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.222 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.299 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.301 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:47:44 np0005535656 nova_compute[187219]: 2025-11-25 18:47:44.301 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:47:44 np0005535656 podman[207530]: 2025-11-25 18:47:44.942995253 +0000 UTC m=+0.064339603 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64)
Nov 25 13:47:45 np0005535656 nova_compute[187219]: 2025-11-25 18:47:45.302 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:47:47 np0005535656 podman[207551]: 2025-11-25 18:47:47.968525659 +0000 UTC m=+0.084290248 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 25 13:47:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:47:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:47:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:47:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:47:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:47:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:47:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:47:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:47:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:47:59.057 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:47:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:47:59.058 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:47:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:47:59.058 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:47:59 np0005535656 podman[207571]: 2025-11-25 18:47:59.954270963 +0000 UTC m=+0.072002866 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:48:05 np0005535656 podman[197580]: time="2025-11-25T18:48:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:48:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:48:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:48:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:48:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Nov 25 13:48:07 np0005535656 podman[207602]: 2025-11-25 18:48:07.948891091 +0000 UTC m=+0.062560188 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 13:48:08 np0005535656 podman[207601]: 2025-11-25 18:48:08.00015032 +0000 UTC m=+0.120647116 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 25 13:48:16 np0005535656 podman[207648]: 2025-11-25 18:48:15.999981444 +0000 UTC m=+0.112633627 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., version=9.6, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, 
managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, architecture=x86_64, vcs-type=git)
Nov 25 13:48:18 np0005535656 podman[207670]: 2025-11-25 18:48:18.965897509 +0000 UTC m=+0.077146080 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:48:30 np0005535656 podman[207691]: 2025-11-25 18:48:30.957579547 +0000 UTC m=+0.075775844 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 13:48:34 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:48:34.035 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:48:34 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:48:34.038 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 13:48:34 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:48:34.040 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:48:38 np0005535656 podman[207717]: 2025-11-25 18:48:38.996060282 +0000 UTC m=+0.099564188 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 25 13:48:39 np0005535656 podman[207716]: 2025-11-25 18:48:39.06113811 +0000 UTC m=+0.173797120 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 13:48:40 np0005535656 nova_compute[187219]: 2025-11-25 18:48:40.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:42 np0005535656 nova_compute[187219]: 2025-11-25 18:48:42.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:42 np0005535656 nova_compute[187219]: 2025-11-25 18:48:42.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:43 np0005535656 nova_compute[187219]: 2025-11-25 18:48:43.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:43 np0005535656 nova_compute[187219]: 2025-11-25 18:48:43.674 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:48:43 np0005535656 nova_compute[187219]: 2025-11-25 18:48:43.674 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:48:43 np0005535656 nova_compute[187219]: 2025-11-25 18:48:43.690 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.721 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.722 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.722 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.723 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.914 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.916 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6166MB free_disk=73.20235443115234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.916 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:48:44 np0005535656 nova_compute[187219]: 2025-11-25 18:48:44.916 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:48:45 np0005535656 nova_compute[187219]: 2025-11-25 18:48:45.009 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:48:45 np0005535656 nova_compute[187219]: 2025-11-25 18:48:45.010 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:48:45 np0005535656 nova_compute[187219]: 2025-11-25 18:48:45.045 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:48:45 np0005535656 nova_compute[187219]: 2025-11-25 18:48:45.081 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:48:45 np0005535656 nova_compute[187219]: 2025-11-25 18:48:45.084 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:48:45 np0005535656 nova_compute[187219]: 2025-11-25 18:48:45.085 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:48:46 np0005535656 podman[207760]: 2025-11-25 18:48:46.985934122 +0000 UTC m=+0.097427833 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 13:48:47 np0005535656 nova_compute[187219]: 2025-11-25 18:48:47.085 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:48:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:48:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:48:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:48:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:48:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:48:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:48:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:48:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:48:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:48:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:48:49 np0005535656 podman[207781]: 2025-11-25 18:48:49.948759484 +0000 UTC m=+0.068526210 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 25 13:48:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:48:59.059 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:48:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:48:59.059 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:48:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:48:59.060 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:49:01 np0005535656 podman[207801]: 2025-11-25 18:49:01.945977049 +0000 UTC m=+0.072925768 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 13:49:05 np0005535656 podman[197580]: time="2025-11-25T18:49:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:49:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:49:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:49:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:49:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Nov 25 13:49:09 np0005535656 podman[207826]: 2025-11-25 18:49:09.991260577 +0000 UTC m=+0.088205647 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:49:10 np0005535656 podman[207825]: 2025-11-25 18:49:10.051252079 +0000 UTC m=+0.156553552 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 13:49:17 np0005535656 podman[207869]: 2025-11-25 18:49:17.987458532 +0000 UTC m=+0.102104614 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, version=9.6, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 13:49:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:49:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:49:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:49:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:49:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:49:20 np0005535656 podman[207890]: 2025-11-25 18:49:20.961156794 +0000 UTC m=+0.083929178 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS)
Nov 25 13:49:32 np0005535656 podman[207910]: 2025-11-25 18:49:32.966918681 +0000 UTC m=+0.072156813 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 13:49:35 np0005535656 podman[197580]: time="2025-11-25T18:49:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:49:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:49:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:49:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:49:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 25 13:49:40 np0005535656 nova_compute[187219]: 2025-11-25 18:49:40.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:40 np0005535656 nova_compute[187219]: 2025-11-25 18:49:40.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 25 13:49:40 np0005535656 nova_compute[187219]: 2025-11-25 18:49:40.695 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 25 13:49:40 np0005535656 nova_compute[187219]: 2025-11-25 18:49:40.696 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:40 np0005535656 nova_compute[187219]: 2025-11-25 18:49:40.697 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 25 13:49:40 np0005535656 nova_compute[187219]: 2025-11-25 18:49:40.710 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:40 np0005535656 podman[207937]: 2025-11-25 18:49:40.98304966 +0000 UTC m=+0.088742458 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 25 13:49:41 np0005535656 podman[207936]: 2025-11-25 18:49:41.026282886 +0000 UTC m=+0.137947874 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 13:49:43 np0005535656 nova_compute[187219]: 2025-11-25 18:49:43.720 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:43 np0005535656 nova_compute[187219]: 2025-11-25 18:49:43.721 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:49:43 np0005535656 nova_compute[187219]: 2025-11-25 18:49:43.722 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:49:43 np0005535656 nova_compute[187219]: 2025-11-25 18:49:43.742 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:49:43 np0005535656 nova_compute[187219]: 2025-11-25 18:49:43.743 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:44 np0005535656 nova_compute[187219]: 2025-11-25 18:49:44.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:44 np0005535656 nova_compute[187219]: 2025-11-25 18:49:44.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:44 np0005535656 nova_compute[187219]: 2025-11-25 18:49:44.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:45 np0005535656 nova_compute[187219]: 2025-11-25 18:49:45.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.674 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.708 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.709 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.709 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.709 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.972 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.973 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6173MB free_disk=73.20235443115234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.974 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:49:46 np0005535656 nova_compute[187219]: 2025-11-25 18:49:46.974 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.123 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.123 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.212 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing inventories for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.258 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating ProviderTree inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.259 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.278 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing aggregate associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.307 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing trait associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.329 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.352 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.356 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:49:47 np0005535656 nova_compute[187219]: 2025-11-25 18:49:47.357 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:49:48 np0005535656 nova_compute[187219]: 2025-11-25 18:49:48.357 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:49:48 np0005535656 podman[207981]: 2025-11-25 18:49:48.952280001 +0000 UTC m=+0.064883509 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=edpm, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Nov 25 13:49:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:49:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:49:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:49:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:49:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:49:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:49:51 np0005535656 podman[208001]: 2025-11-25 18:49:51.985236981 +0000 UTC m=+0.090576895 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 13:49:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:49:59.060 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:49:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:49:59.062 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:49:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:49:59.062 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:50:03 np0005535656 podman[208022]: 2025-11-25 18:50:03.959549166 +0000 UTC m=+0.076608823 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 13:50:05 np0005535656 podman[197580]: time="2025-11-25T18:50:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:50:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:50:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:50:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:50:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Nov 25 13:50:11 np0005535656 podman[208047]: 2025-11-25 18:50:11.980977358 +0000 UTC m=+0.089907128 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:50:12 np0005535656 podman[208046]: 2025-11-25 18:50:12.003407661 +0000 UTC m=+0.122025054 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 13:50:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:50:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:50:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:50:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:50:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:50:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:50:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:50:19 np0005535656 podman[208092]: 2025-11-25 18:50:19.984588187 +0000 UTC m=+0.095824949 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, distribution-scope=public, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Nov 25 13:50:23 np0005535656 podman[208114]: 2025-11-25 18:50:23.004680984 +0000 UTC m=+0.120039177 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 13:50:34 np0005535656 podman[208136]: 2025-11-25 18:50:34.95673478 +0000 UTC m=+0.065658439 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:50:35 np0005535656 podman[197580]: time="2025-11-25T18:50:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:50:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:50:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:50:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:50:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Nov 25 13:50:42 np0005535656 nova_compute[187219]: 2025-11-25 18:50:42.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:42 np0005535656 podman[208161]: 2025-11-25 18:50:42.960587899 +0000 UTC m=+0.073494225 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:50:43 np0005535656 podman[208160]: 2025-11-25 18:50:43.000691496 +0000 UTC m=+0.116236855 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 25 13:50:44 np0005535656 nova_compute[187219]: 2025-11-25 18:50:44.676 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:44 np0005535656 nova_compute[187219]: 2025-11-25 18:50:44.676 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:50:44 np0005535656 nova_compute[187219]: 2025-11-25 18:50:44.677 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:50:44 np0005535656 nova_compute[187219]: 2025-11-25 18:50:44.696 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:50:44 np0005535656 nova_compute[187219]: 2025-11-25 18:50:44.696 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:46 np0005535656 nova_compute[187219]: 2025-11-25 18:50:46.675 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:46 np0005535656 nova_compute[187219]: 2025-11-25 18:50:46.676 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:46 np0005535656 nova_compute[187219]: 2025-11-25 18:50:46.676 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:46 np0005535656 nova_compute[187219]: 2025-11-25 18:50:46.676 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:46 np0005535656 nova_compute[187219]: 2025-11-25 18:50:46.676 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:46 np0005535656 nova_compute[187219]: 2025-11-25 18:50:46.677 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:50:47 np0005535656 nova_compute[187219]: 2025-11-25 18:50:47.678 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:47 np0005535656 nova_compute[187219]: 2025-11-25 18:50:47.704 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:50:47 np0005535656 nova_compute[187219]: 2025-11-25 18:50:47.704 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:50:47 np0005535656 nova_compute[187219]: 2025-11-25 18:50:47.704 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:50:47 np0005535656 nova_compute[187219]: 2025-11-25 18:50:47.705 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:50:47 np0005535656 nova_compute[187219]: 2025-11-25 18:50:47.891 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:50:47 np0005535656 nova_compute[187219]: 2025-11-25 18:50:47.892 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6192MB free_disk=73.20270538330078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:50:47 np0005535656 nova_compute[187219]: 2025-11-25 18:50:47.893 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:50:47 np0005535656 nova_compute[187219]: 2025-11-25 18:50:47.893 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:50:48 np0005535656 nova_compute[187219]: 2025-11-25 18:50:48.027 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:50:48 np0005535656 nova_compute[187219]: 2025-11-25 18:50:48.027 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:50:48 np0005535656 nova_compute[187219]: 2025-11-25 18:50:48.055 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:50:48 np0005535656 nova_compute[187219]: 2025-11-25 18:50:48.067 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:50:48 np0005535656 nova_compute[187219]: 2025-11-25 18:50:48.069 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:50:48 np0005535656 nova_compute[187219]: 2025-11-25 18:50:48.069 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:50:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:50:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:50:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:50:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:50:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:50:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:50:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:50:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:50:50 np0005535656 nova_compute[187219]: 2025-11-25 18:50:50.065 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:50:50 np0005535656 podman[208203]: 2025-11-25 18:50:50.988162958 +0000 UTC m=+0.098054922 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350)
Nov 25 13:50:53 np0005535656 podman[208225]: 2025-11-25 18:50:53.959601373 +0000 UTC m=+0.075759689 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 25 13:50:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:50:59.062 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:50:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:50:59.063 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:50:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:50:59.063 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:51:05 np0005535656 podman[197580]: time="2025-11-25T18:51:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:51:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:51:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:51:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:51:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Nov 25 13:51:05 np0005535656 podman[208249]: 2025-11-25 18:51:05.931775148 +0000 UTC m=+0.057775531 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:51:13 np0005535656 podman[208274]: 2025-11-25 18:51:13.97802311 +0000 UTC m=+0.086617767 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:51:14 np0005535656 podman[208273]: 2025-11-25 18:51:14.034588521 +0000 UTC m=+0.146355285 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 25 13:51:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:51:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:51:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:51:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:51:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:51:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:51:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:51:21 np0005535656 podman[208320]: 2025-11-25 18:51:21.981033912 +0000 UTC m=+0.097377815 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, release=1755695350, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 13:51:24 np0005535656 podman[208341]: 2025-11-25 18:51:24.967863246 +0000 UTC m=+0.079753215 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 13:51:35 np0005535656 podman[197580]: time="2025-11-25T18:51:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:51:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:51:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:51:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:51:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Nov 25 13:51:36 np0005535656 podman[208361]: 2025-11-25 18:51:36.972838033 +0000 UTC m=+0.087546211 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:51:45 np0005535656 podman[208387]: 2025-11-25 18:51:45.012218963 +0000 UTC m=+0.115052688 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 25 13:51:45 np0005535656 podman[208386]: 2025-11-25 18:51:45.036777719 +0000 UTC m=+0.158300560 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 25 13:51:45 np0005535656 nova_compute[187219]: 2025-11-25 18:51:45.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:51:46 np0005535656 nova_compute[187219]: 2025-11-25 18:51:46.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:51:46 np0005535656 nova_compute[187219]: 2025-11-25 18:51:46.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:51:46 np0005535656 nova_compute[187219]: 2025-11-25 18:51:46.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:51:46 np0005535656 nova_compute[187219]: 2025-11-25 18:51:46.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:51:46 np0005535656 nova_compute[187219]: 2025-11-25 18:51:46.691 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:51:46 np0005535656 nova_compute[187219]: 2025-11-25 18:51:46.692 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:51:48 np0005535656 nova_compute[187219]: 2025-11-25 18:51:48.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:51:48 np0005535656 nova_compute[187219]: 2025-11-25 18:51:48.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:51:48 np0005535656 nova_compute[187219]: 2025-11-25 18:51:48.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:51:48 np0005535656 nova_compute[187219]: 2025-11-25 18:51:48.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:51:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:51:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:51:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:51:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:51:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:51:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:51:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:51:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.707 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.708 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.708 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.709 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.915 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.918 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6196MB free_disk=73.20203018188477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.918 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.918 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.983 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:51:49 np0005535656 nova_compute[187219]: 2025-11-25 18:51:49.984 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:51:50 np0005535656 nova_compute[187219]: 2025-11-25 18:51:50.011 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:51:50 np0005535656 nova_compute[187219]: 2025-11-25 18:51:50.027 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:51:50 np0005535656 nova_compute[187219]: 2025-11-25 18:51:50.030 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:51:50 np0005535656 nova_compute[187219]: 2025-11-25 18:51:50.031 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:51:52 np0005535656 nova_compute[187219]: 2025-11-25 18:51:52.033 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:51:52 np0005535656 podman[208433]: 2025-11-25 18:51:52.986629823 +0000 UTC m=+0.096653319 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, vendor=Red Hat, Inc.)
Nov 25 13:51:55 np0005535656 podman[208455]: 2025-11-25 18:51:55.989918238 +0000 UTC m=+0.093230418 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251118, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 13:51:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:51:59.064 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:51:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:51:59.065 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:51:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:51:59.065 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:04 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:04.211 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:52:04 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:04.213 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 13:52:05 np0005535656 podman[197580]: time="2025-11-25T18:52:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:52:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:52:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:52:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:52:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 25 13:52:07 np0005535656 podman[208476]: 2025-11-25 18:52:07.95366126 +0000 UTC m=+0.071097247 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 13:52:09 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:09.216 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:52:15 np0005535656 podman[208501]: 2025-11-25 18:52:15.974646465 +0000 UTC m=+0.087155295 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 13:52:16 np0005535656 podman[208500]: 2025-11-25 18:52:16.025938013 +0000 UTC m=+0.136662776 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Nov 25 13:52:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:52:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:52:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:52:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:52:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:52:24 np0005535656 podman[208546]: 2025-11-25 18:52:24.006772496 +0000 UTC m=+0.119099847 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 13:52:26 np0005535656 podman[208568]: 2025-11-25 18:52:26.975598267 +0000 UTC m=+0.089959717 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 13:52:35 np0005535656 podman[197580]: time="2025-11-25T18:52:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:52:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:52:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:52:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:52:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Nov 25 13:52:38 np0005535656 podman[208589]: 2025-11-25 18:52:38.963366012 +0000 UTC m=+0.079944586 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:52:40 np0005535656 nova_compute[187219]: 2025-11-25 18:52:40.751 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:40 np0005535656 nova_compute[187219]: 2025-11-25 18:52:40.752 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:40 np0005535656 nova_compute[187219]: 2025-11-25 18:52:40.772 187223 DEBUG nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 13:52:40 np0005535656 nova_compute[187219]: 2025-11-25 18:52:40.914 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:40 np0005535656 nova_compute[187219]: 2025-11-25 18:52:40.915 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:40 np0005535656 nova_compute[187219]: 2025-11-25 18:52:40.923 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 13:52:40 np0005535656 nova_compute[187219]: 2025-11-25 18:52:40.923 187223 INFO nova.compute.claims [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.073 187223 DEBUG nova.compute.provider_tree [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.091 187223 DEBUG nova.scheduler.client.report [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.119 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.120 187223 DEBUG nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.182 187223 DEBUG nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.183 187223 DEBUG nova.network.neutron [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.212 187223 INFO nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.237 187223 DEBUG nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.332 187223 DEBUG nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.335 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.335 187223 INFO nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Creating image(s)#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.337 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.338 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.339 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.340 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.341 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.972 187223 WARNING oslo_policy.policy [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.972 187223 WARNING oslo_policy.policy [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 25 13:52:41 np0005535656 nova_compute[187219]: 2025-11-25 18:52:41.974 187223 DEBUG nova.policy [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be3c7719092245a3b39ec72ada0c5247', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90f5f32749934e1bb4a31b5643dc964a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 13:52:43 np0005535656 nova_compute[187219]: 2025-11-25 18:52:43.456 187223 DEBUG nova.network.neutron [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Successfully created port: 7fa3def6-9a8f-401b-8172-8dff9d7542e6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 13:52:43 np0005535656 nova_compute[187219]: 2025-11-25 18:52:43.639 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:43 np0005535656 nova_compute[187219]: 2025-11-25 18:52:43.718 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473.part --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:43 np0005535656 nova_compute[187219]: 2025-11-25 18:52:43.721 187223 DEBUG nova.virt.images [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] 1ea5e141-b92c-44f3-97b7-7b313587d3bf was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 25 13:52:43 np0005535656 nova_compute[187219]: 2025-11-25 18:52:43.725 187223 DEBUG nova.privsep.utils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 25 13:52:43 np0005535656 nova_compute[187219]: 2025-11-25 18:52:43.726 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473.part /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:43 np0005535656 nova_compute[187219]: 2025-11-25 18:52:43.942 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473.part /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473.converted" returned: 0 in 0.216s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:43 np0005535656 nova_compute[187219]: 2025-11-25 18:52:43.951 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.027 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473.converted --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.030 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.061 187223 INFO oslo.privsep.daemon [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpdcq1jmzg/privsep.sock']#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.484 187223 DEBUG nova.network.neutron [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Successfully updated port: 7fa3def6-9a8f-401b-8172-8dff9d7542e6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.509 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.510 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquired lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.510 187223 DEBUG nova.network.neutron [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.742 187223 DEBUG nova.network.neutron [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.884 187223 INFO oslo.privsep.daemon [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.713 208634 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.719 208634 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.721 208634 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.721 208634 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208634#033[00m
Nov 25 13:52:44 np0005535656 nova_compute[187219]: 2025-11-25 18:52:44.978 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.057 187223 DEBUG nova.compute.manager [req-b41876c2-c9f4-4283-9a88-b4c629ac42a0 req-3e871ac2-8916-4b55-9ab4-346f200d3660 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received event network-changed-7fa3def6-9a8f-401b-8172-8dff9d7542e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.058 187223 DEBUG nova.compute.manager [req-b41876c2-c9f4-4283-9a88-b4c629ac42a0 req-3e871ac2-8916-4b55-9ab4-346f200d3660 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Refreshing instance network info cache due to event network-changed-7fa3def6-9a8f-401b-8172-8dff9d7542e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.058 187223 DEBUG oslo_concurrency.lockutils [req-b41876c2-c9f4-4283-9a88-b4c629ac42a0 req-3e871ac2-8916-4b55-9ab4-346f200d3660 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.060 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.060 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.061 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.076 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.147 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.149 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.194 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.196 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.197 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.268 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.269 187223 DEBUG nova.virt.disk.api [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Checking if we can resize image /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.269 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.348 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.349 187223 DEBUG nova.virt.disk.api [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Cannot resize image /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.350 187223 DEBUG nova.objects.instance [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'migration_context' on Instance uuid 36d95b58-84fc-4d29-9aa8-0d0a919c9c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.377 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.377 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Ensure instance console log exists: /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.378 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.379 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.379 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:45 np0005535656 nova_compute[187219]: 2025-11-25 18:52:45.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.594 187223 DEBUG nova.network.neutron [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Updating instance_info_cache with network_info: [{"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.623 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Releasing lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.623 187223 DEBUG nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Instance network_info: |[{"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.624 187223 DEBUG oslo_concurrency.lockutils [req-b41876c2-c9f4-4283-9a88-b4c629ac42a0 req-3e871ac2-8916-4b55-9ab4-346f200d3660 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.624 187223 DEBUG nova.network.neutron [req-b41876c2-c9f4-4283-9a88-b4c629ac42a0 req-3e871ac2-8916-4b55-9ab4-346f200d3660 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Refreshing network info cache for port 7fa3def6-9a8f-401b-8172-8dff9d7542e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.630 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Start _get_guest_xml network_info=[{"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.637 187223 WARNING nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.652 187223 DEBUG nova.virt.libvirt.host [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.653 187223 DEBUG nova.virt.libvirt.host [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.657 187223 DEBUG nova.virt.libvirt.host [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.658 187223 DEBUG nova.virt.libvirt.host [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.660 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.660 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.661 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.661 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.662 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.662 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.663 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.663 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.664 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.664 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.664 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.665 187223 DEBUG nova.virt.hardware [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.671 187223 DEBUG nova.privsep.utils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.672 187223 DEBUG nova.virt.libvirt.vif [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:52:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1103197619',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1103197619',id=1,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-m202b224',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-proj
ect-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:52:41Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=36d95b58-84fc-4d29-9aa8-0d0a919c9c72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.673 187223 DEBUG nova.network.os_vif_util [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.674 187223 DEBUG nova.network.os_vif_util [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.677 187223 DEBUG nova.objects.instance [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'pci_devices' on Instance uuid 36d95b58-84fc-4d29-9aa8-0d0a919c9c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.679 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.740 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] End _get_guest_xml xml=<domain type="kvm">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <uuid>36d95b58-84fc-4d29-9aa8-0d0a919c9c72</uuid>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <name>instance-00000001</name>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-1103197619</nova:name>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 18:52:46</nova:creationTime>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:        <nova:user uuid="be3c7719092245a3b39ec72ada0c5247">tempest-TestExecuteActionsViaActuator-53937300-project-member</nova:user>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:        <nova:project uuid="90f5f32749934e1bb4a31b5643dc964a">tempest-TestExecuteActionsViaActuator-53937300</nova:project>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:        <nova:port uuid="7fa3def6-9a8f-401b-8172-8dff9d7542e6">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <system>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <entry name="serial">36d95b58-84fc-4d29-9aa8-0d0a919c9c72</entry>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <entry name="uuid">36d95b58-84fc-4d29-9aa8-0d0a919c9c72</entry>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    </system>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <os>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  </os>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <features>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  </features>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  </clock>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  <devices>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.config"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:ff:5a:1b"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <target dev="tap7fa3def6-9a"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    </interface>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/console.log" append="off"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    </serial>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <video>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    </video>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    </rng>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 13:52:46 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 13:52:46 np0005535656 nova_compute[187219]:  </devices>
Nov 25 13:52:46 np0005535656 nova_compute[187219]: </domain>
Nov 25 13:52:46 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.743 187223 DEBUG nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Preparing to wait for external event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.743 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.743 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.744 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.745 187223 DEBUG nova.virt.libvirt.vif [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:52:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1103197619',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1103197619',id=1,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-m202b224',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-539
37300-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:52:41Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=36d95b58-84fc-4d29-9aa8-0d0a919c9c72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.745 187223 DEBUG nova.network.os_vif_util [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.746 187223 DEBUG nova.network.os_vif_util [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.747 187223 DEBUG os_vif [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.792 187223 DEBUG ovsdbapp.backend.ovs_idl [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.792 187223 DEBUG ovsdbapp.backend.ovs_idl [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.792 187223 DEBUG ovsdbapp.backend.ovs_idl [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.793 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.793 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.794 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.795 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.796 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.796 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.796 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.797 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.801 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.808 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.808 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.809 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.809 187223 INFO oslo.privsep.daemon [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpd8vx4fu8/privsep.sock']#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.829 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 25 13:52:46 np0005535656 nova_compute[187219]: 2025-11-25 18:52:46.830 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:52:46 np0005535656 podman[208655]: 2025-11-25 18:52:46.989466524 +0000 UTC m=+0.095231690 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 25 13:52:47 np0005535656 podman[208654]: 2025-11-25 18:52:47.069341629 +0000 UTC m=+0.182101955 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.582 187223 INFO oslo.privsep.daemon [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.434 208701 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.442 208701 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.446 208701 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.446 208701 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208701#033[00m
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.913 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.913 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7fa3def6-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.914 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7fa3def6-9a, col_values=(('external_ids', {'iface-id': '7fa3def6-9a8f-401b-8172-8dff9d7542e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:5a:1b', 'vm-uuid': '36d95b58-84fc-4d29-9aa8-0d0a919c9c72'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:52:47 np0005535656 NetworkManager[55548]: <info>  [1764096767.9184] manager: (tap7fa3def6-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.919 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.926 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:47 np0005535656 nova_compute[187219]: 2025-11-25 18:52:47.928 187223 INFO os_vif [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a')#033[00m
Nov 25 13:52:48 np0005535656 nova_compute[187219]: 2025-11-25 18:52:48.018 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 13:52:48 np0005535656 nova_compute[187219]: 2025-11-25 18:52:48.019 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 13:52:48 np0005535656 nova_compute[187219]: 2025-11-25 18:52:48.019 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No VIF found with MAC fa:16:3e:ff:5a:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 13:52:48 np0005535656 nova_compute[187219]: 2025-11-25 18:52:48.020 187223 INFO nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Using config drive#033[00m
Nov 25 13:52:48 np0005535656 nova_compute[187219]: 2025-11-25 18:52:48.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:48 np0005535656 nova_compute[187219]: 2025-11-25 18:52:48.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:52:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:52:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:52:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:52:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:52:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:52:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:52:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:52:49 np0005535656 nova_compute[187219]: 2025-11-25 18:52:49.662 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:49 np0005535656 nova_compute[187219]: 2025-11-25 18:52:49.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:49 np0005535656 nova_compute[187219]: 2025-11-25 18:52:49.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:49 np0005535656 nova_compute[187219]: 2025-11-25 18:52:49.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:49 np0005535656 nova_compute[187219]: 2025-11-25 18:52:49.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:52:49 np0005535656 nova_compute[187219]: 2025-11-25 18:52:49.988 187223 DEBUG nova.network.neutron [req-b41876c2-c9f4-4283-9a88-b4c629ac42a0 req-3e871ac2-8916-4b55-9ab4-346f200d3660 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Updated VIF entry in instance network info cache for port 7fa3def6-9a8f-401b-8172-8dff9d7542e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 13:52:49 np0005535656 nova_compute[187219]: 2025-11-25 18:52:49.989 187223 DEBUG nova.network.neutron [req-b41876c2-c9f4-4283-9a88-b4c629ac42a0 req-3e871ac2-8916-4b55-9ab4-346f200d3660 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Updating instance_info_cache with network_info: [{"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.010 187223 DEBUG oslo_concurrency.lockutils [req-b41876c2-c9f4-4283-9a88-b4c629ac42a0 req-3e871ac2-8916-4b55-9ab4-346f200d3660 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.014 187223 INFO nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Creating config drive at /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.config#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.018 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdrrfbfpj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.144 187223 DEBUG oslo_concurrency.processutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdrrfbfpj" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:50 np0005535656 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 25 13:52:50 np0005535656 kernel: tap7fa3def6-9a: entered promiscuous mode
Nov 25 13:52:50 np0005535656 NetworkManager[55548]: <info>  [1764096770.2728] manager: (tap7fa3def6-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.275 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:50 np0005535656 ovn_controller[95460]: 2025-11-25T18:52:50Z|00027|binding|INFO|Claiming lport 7fa3def6-9a8f-401b-8172-8dff9d7542e6 for this chassis.
Nov 25 13:52:50 np0005535656 ovn_controller[95460]: 2025-11-25T18:52:50Z|00028|binding|INFO|7fa3def6-9a8f-401b-8172-8dff9d7542e6: Claiming fa:16:3e:ff:5a:1b 10.100.0.4
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.283 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:50 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:50.295 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:5a:1b 10.100.0.4'], port_security=['fa:16:3e:ff:5a:1b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '36d95b58-84fc-4d29-9aa8-0d0a919c9c72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90f5f32749934e1bb4a31b5643dc964a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3235d006-85b4-4c07-966c-48d4df16258d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dde4be2a-475e-47e2-8532-faebae80eb26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=7fa3def6-9a8f-401b-8172-8dff9d7542e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:52:50 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:50.297 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 7fa3def6-9a8f-401b-8172-8dff9d7542e6 in datapath fe81e455-495f-4aea-8dd6-8b6f8cf5d198 bound to our chassis#033[00m
Nov 25 13:52:50 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:50.303 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe81e455-495f-4aea-8dd6-8b6f8cf5d198#033[00m
Nov 25 13:52:50 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:50.305 104346 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp0xpqoy1m/privsep.sock']#033[00m
Nov 25 13:52:50 np0005535656 systemd-udevd[208731]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:52:50 np0005535656 NetworkManager[55548]: <info>  [1764096770.3463] device (tap7fa3def6-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 13:52:50 np0005535656 NetworkManager[55548]: <info>  [1764096770.3470] device (tap7fa3def6-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 13:52:50 np0005535656 systemd-machined[153481]: New machine qemu-1-instance-00000001.
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.383 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:50 np0005535656 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Nov 25 13:52:50 np0005535656 ovn_controller[95460]: 2025-11-25T18:52:50Z|00029|binding|INFO|Setting lport 7fa3def6-9a8f-401b-8172-8dff9d7542e6 ovn-installed in OVS
Nov 25 13:52:50 np0005535656 ovn_controller[95460]: 2025-11-25T18:52:50Z|00030|binding|INFO|Setting lport 7fa3def6-9a8f-401b-8172-8dff9d7542e6 up in Southbound
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.390 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.702 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.703 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.703 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.704 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.812 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.909 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.911 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:52:50 np0005535656 nova_compute[187219]: 2025-11-25 18:52:50.990 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:51.058 104346 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:51.059 104346 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp0xpqoy1m/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:50.913 208749 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:50.919 208749 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:50.923 208749 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:50.923 208749 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208749#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:51.062 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e514b7-58ad-4670-9563-170eeefb67c6]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.233 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.235 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6031MB free_disk=73.16709518432617GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.236 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.239 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.244 187223 DEBUG nova.compute.manager [req-0dcf1516-1e65-4f38-b34a-0a5fde4f3907 req-ba945fa7-221b-492b-80b8-6b3fefe4468c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.244 187223 DEBUG oslo_concurrency.lockutils [req-0dcf1516-1e65-4f38-b34a-0a5fde4f3907 req-ba945fa7-221b-492b-80b8-6b3fefe4468c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.244 187223 DEBUG oslo_concurrency.lockutils [req-0dcf1516-1e65-4f38-b34a-0a5fde4f3907 req-ba945fa7-221b-492b-80b8-6b3fefe4468c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.245 187223 DEBUG oslo_concurrency.lockutils [req-0dcf1516-1e65-4f38-b34a-0a5fde4f3907 req-ba945fa7-221b-492b-80b8-6b3fefe4468c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.245 187223 DEBUG nova.compute.manager [req-0dcf1516-1e65-4f38-b34a-0a5fde4f3907 req-ba945fa7-221b-492b-80b8-6b3fefe4468c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Processing event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.309 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096771.309063, 36d95b58-84fc-4d29-9aa8-0d0a919c9c72 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.310 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] VM Started (Lifecycle Event)#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.313 187223 DEBUG nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.321 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.338 187223 INFO nova.virt.libvirt.driver [-] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Instance spawned successfully.#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.339 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.348 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 36d95b58-84fc-4d29-9aa8-0d0a919c9c72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.348 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.349 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.356 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.360 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.390 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.390 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.390 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.391 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.391 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.392 187223 DEBUG nova.virt.libvirt.driver [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.395 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.395 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096771.310531, 36d95b58-84fc-4d29-9aa8-0d0a919c9c72 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.395 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] VM Paused (Lifecycle Event)#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.420 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.423 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096771.3207772, 36d95b58-84fc-4d29-9aa8-0d0a919c9c72 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.423 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] VM Resumed (Lifecycle Event)#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.427 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.481 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.485 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.507 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.517 187223 INFO nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Took 10.18 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.518 187223 DEBUG nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.522 187223 ERROR nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [req-8a7d1512-51c2-4133-80f7-e312bcdacd87] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 752b63a7-2ce2-4d83-a281-12c9803714ea.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-8a7d1512-51c2-4133-80f7-e312bcdacd87"}]}#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.545 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing inventories for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.566 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating ProviderTree inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.568 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.592 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing aggregate associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.608 187223 INFO nova.compute.manager [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Took 10.73 seconds to build instance.#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.626 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing trait associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.637 187223 DEBUG oslo_concurrency.lockutils [None req-2ad42caa-1c24-430e-a16a-6e8a4b8cd4de be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:51.642 208749 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:51.642 208749 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:51.642 208749 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.714 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.790 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updated inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.791 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.792 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.817 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:52:51 np0005535656 nova_compute[187219]: 2025-11-25 18:52:51.817 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:52 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:52.246 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4039da23-efe3-461f-9a88-4b398bbb6446]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:52 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:52.248 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe81e455-41 in ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 13:52:52 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:52.250 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe81e455-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 13:52:52 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:52.251 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[97680413-a82f-4e17-9838-272f0893117b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:52 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:52.254 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc1a535-68c9-4135-9b2f-eb606371c691]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:52 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:52.294 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[e489f33e-f3ae-40b7-bfd7-ab8d591b700d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:52 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:52.331 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c1708a0e-a0ca-4ed4-9c26-41cc989cc932]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:52 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:52.333 104346 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp925_udxp/privsep.sock']#033[00m
Nov 25 13:52:52 np0005535656 nova_compute[187219]: 2025-11-25 18:52:52.976 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.186 104346 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.188 104346 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp925_udxp/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.055 208774 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.061 208774 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.063 208774 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.064 208774 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208774#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.192 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[1d222796-d57a-4442-b488-d1cf0509df06]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:53 np0005535656 nova_compute[187219]: 2025-11-25 18:52:53.338 187223 DEBUG nova.compute.manager [req-5d6d777d-78f8-4aa3-93de-0a91a9d3eb23 req-37608a79-bca1-491b-b111-61210855a7cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:52:53 np0005535656 nova_compute[187219]: 2025-11-25 18:52:53.339 187223 DEBUG oslo_concurrency.lockutils [req-5d6d777d-78f8-4aa3-93de-0a91a9d3eb23 req-37608a79-bca1-491b-b111-61210855a7cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:53 np0005535656 nova_compute[187219]: 2025-11-25 18:52:53.339 187223 DEBUG oslo_concurrency.lockutils [req-5d6d777d-78f8-4aa3-93de-0a91a9d3eb23 req-37608a79-bca1-491b-b111-61210855a7cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:53 np0005535656 nova_compute[187219]: 2025-11-25 18:52:53.340 187223 DEBUG oslo_concurrency.lockutils [req-5d6d777d-78f8-4aa3-93de-0a91a9d3eb23 req-37608a79-bca1-491b-b111-61210855a7cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:53 np0005535656 nova_compute[187219]: 2025-11-25 18:52:53.340 187223 DEBUG nova.compute.manager [req-5d6d777d-78f8-4aa3-93de-0a91a9d3eb23 req-37608a79-bca1-491b-b111-61210855a7cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] No waiting events found dispatching network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:52:53 np0005535656 nova_compute[187219]: 2025-11-25 18:52:53.340 187223 WARNING nova.compute.manager [req-5d6d777d-78f8-4aa3-93de-0a91a9d3eb23 req-37608a79-bca1-491b-b111-61210855a7cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received unexpected event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 for instance with vm_state active and task_state None.#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.666 208774 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.666 208774 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:53.666 208774 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:53 np0005535656 nova_compute[187219]: 2025-11-25 18:52:53.820 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.230 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4ffcf5-a164-4178-b0e8-3a591d425ebe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 NetworkManager[55548]: <info>  [1764096774.2615] manager: (tapfe81e455-40): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.254 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1ee2ed-2418-465a-bd5a-27e82674f7c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.306 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[14008430-7cd5-4f91-a8bd-6fea2092b511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.312 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca39b17-0d4a-45cf-a596-83a6d5b3f2c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 systemd-udevd[208797]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:52:54 np0005535656 NetworkManager[55548]: <info>  [1764096774.3528] device (tapfe81e455-40): carrier: link connected
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.357 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[d15dd69d-c7b5-495f-918d-0e2cda1d271f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 podman[208783]: 2025-11-25 18:52:54.382568573 +0000 UTC m=+0.092824166 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.382 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1b41c4f0-9253-472d-b691-b01d0625d6cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe81e455-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:a2:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373587, 'reachable_time': 32021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208822, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.408 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b8229ea0-8b08-4c15-90ab-b9638ecf893b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:a250'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 373587, 'tstamp': 373587}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208825, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.425 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[57c88bb7-84f5-498c-ad5c-776c996af3a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe81e455-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:a2:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373587, 'reachable_time': 32021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208826, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.457 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[06aa4d9f-4e71-483c-b1da-fdcf969688b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.532 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c04d4811-fc33-483f-ab14-604beb4387b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.535 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe81e455-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.536 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.537 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe81e455-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:52:54 np0005535656 nova_compute[187219]: 2025-11-25 18:52:54.541 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:54 np0005535656 kernel: tapfe81e455-40: entered promiscuous mode
Nov 25 13:52:54 np0005535656 NetworkManager[55548]: <info>  [1764096774.5433] manager: (tapfe81e455-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.548 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe81e455-40, col_values=(('external_ids', {'iface-id': '035fc4d6-bdf9-4495-a5a8-2c835f3dfc48'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:52:54 np0005535656 ovn_controller[95460]: 2025-11-25T18:52:54Z|00031|binding|INFO|Releasing lport 035fc4d6-bdf9-4495-a5a8-2c835f3dfc48 from this chassis (sb_readonly=0)
Nov 25 13:52:54 np0005535656 nova_compute[187219]: 2025-11-25 18:52:54.550 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.553 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe81e455-495f-4aea-8dd6-8b6f8cf5d198.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe81e455-495f-4aea-8dd6-8b6f8cf5d198.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.555 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[92c154c2-c7ad-4c73-9908-6b4c7cdfbbd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.558 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-fe81e455-495f-4aea-8dd6-8b6f8cf5d198
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/fe81e455-495f-4aea-8dd6-8b6f8cf5d198.pid.haproxy
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID fe81e455-495f-4aea-8dd6-8b6f8cf5d198
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 13:52:54 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:54.560 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'env', 'PROCESS_TAG=haproxy-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe81e455-495f-4aea-8dd6-8b6f8cf5d198.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 13:52:54 np0005535656 nova_compute[187219]: 2025-11-25 18:52:54.561 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:54 np0005535656 nova_compute[187219]: 2025-11-25 18:52:54.663 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:55 np0005535656 podman[208857]: 2025-11-25 18:52:55.04401759 +0000 UTC m=+0.104364788 container create 8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 25 13:52:55 np0005535656 podman[208857]: 2025-11-25 18:52:54.988254029 +0000 UTC m=+0.048601287 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 13:52:55 np0005535656 systemd[1]: Started libpod-conmon-8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6.scope.
Nov 25 13:52:55 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:52:55 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cac13af52bb08dff21398b7fac9b4820f9e1bc4b73eb0e44046cc2ae73c5368/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 13:52:55 np0005535656 podman[208857]: 2025-11-25 18:52:55.148336806 +0000 UTC m=+0.208683974 container init 8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:52:55 np0005535656 podman[208857]: 2025-11-25 18:52:55.155217022 +0000 UTC m=+0.215564190 container start 8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:52:55 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[208872]: [NOTICE]   (208876) : New worker (208878) forked
Nov 25 13:52:55 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[208872]: [NOTICE]   (208876) : Loading success.
Nov 25 13:52:57 np0005535656 podman[208887]: 2025-11-25 18:52:57.976895403 +0000 UTC m=+0.080496312 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 25 13:52:57 np0005535656 nova_compute[187219]: 2025-11-25 18:52:57.985 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:52:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:59.065 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:52:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:59.066 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:52:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:52:59.067 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:52:59 np0005535656 nova_compute[187219]: 2025-11-25 18:52:59.667 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:03 np0005535656 nova_compute[187219]: 2025-11-25 18:53:03.017 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:04 np0005535656 nova_compute[187219]: 2025-11-25 18:53:04.670 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:04 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:04Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:5a:1b 10.100.0.4
Nov 25 13:53:04 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:04Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:5a:1b 10.100.0.4
Nov 25 13:53:05 np0005535656 podman[197580]: time="2025-11-25T18:53:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:53:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:53:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:53:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:53:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3035 "" "Go-http-client/1.1"
Nov 25 13:53:08 np0005535656 nova_compute[187219]: 2025-11-25 18:53:08.020 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:08 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:08.931 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:53:08 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:08.932 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 13:53:08 np0005535656 nova_compute[187219]: 2025-11-25 18:53:08.932 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:09 np0005535656 nova_compute[187219]: 2025-11-25 18:53:09.674 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:09 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:09.936 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:09 np0005535656 podman[208927]: 2025-11-25 18:53:09.993836848 +0000 UTC m=+0.099940928 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 13:53:13 np0005535656 nova_compute[187219]: 2025-11-25 18:53:13.025 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:14 np0005535656 nova_compute[187219]: 2025-11-25 18:53:14.679 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:17 np0005535656 podman[208952]: 2025-11-25 18:53:17.964924292 +0000 UTC m=+0.072621217 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 13:53:18 np0005535656 podman[208951]: 2025-11-25 18:53:18.001784051 +0000 UTC m=+0.114105222 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:53:18 np0005535656 nova_compute[187219]: 2025-11-25 18:53:18.027 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:53:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:53:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:53:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:53:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:53:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:53:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:53:19 np0005535656 nova_compute[187219]: 2025-11-25 18:53:19.681 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:23 np0005535656 nova_compute[187219]: 2025-11-25 18:53:23.063 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:23 np0005535656 nova_compute[187219]: 2025-11-25 18:53:23.605 187223 DEBUG oslo_concurrency.lockutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:53:23 np0005535656 nova_compute[187219]: 2025-11-25 18:53:23.606 187223 DEBUG oslo_concurrency.lockutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:53:23 np0005535656 nova_compute[187219]: 2025-11-25 18:53:23.607 187223 DEBUG nova.network.neutron [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 13:53:24 np0005535656 nova_compute[187219]: 2025-11-25 18:53:24.684 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:25 np0005535656 podman[208993]: 2025-11-25 18:53:25.023696615 +0000 UTC m=+0.132813228 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm)
Nov 25 13:53:25 np0005535656 nova_compute[187219]: 2025-11-25 18:53:25.253 187223 DEBUG nova.network.neutron [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Updating instance_info_cache with network_info: [{"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:53:25 np0005535656 nova_compute[187219]: 2025-11-25 18:53:25.358 187223 DEBUG oslo_concurrency.lockutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:53:25 np0005535656 nova_compute[187219]: 2025-11-25 18:53:25.722 187223 DEBUG nova.virt.libvirt.driver [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 25 13:53:25 np0005535656 nova_compute[187219]: 2025-11-25 18:53:25.723 187223 DEBUG nova.virt.libvirt.volume.remotefs [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Creating file /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/6cf3a6dada4d4b1b8d4942f31b92644f.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 25 13:53:25 np0005535656 nova_compute[187219]: 2025-11-25 18:53:25.724 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/6cf3a6dada4d4b1b8d4942f31b92644f.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:26 np0005535656 nova_compute[187219]: 2025-11-25 18:53:26.326 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/6cf3a6dada4d4b1b8d4942f31b92644f.tmp" returned: 1 in 0.602s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:26 np0005535656 nova_compute[187219]: 2025-11-25 18:53:26.327 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/6cf3a6dada4d4b1b8d4942f31b92644f.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 25 13:53:26 np0005535656 nova_compute[187219]: 2025-11-25 18:53:26.328 187223 DEBUG nova.virt.libvirt.volume.remotefs [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Creating directory /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 25 13:53:26 np0005535656 nova_compute[187219]: 2025-11-25 18:53:26.329 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:26 np0005535656 nova_compute[187219]: 2025-11-25 18:53:26.578 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72" returned: 0 in 0.249s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:26 np0005535656 nova_compute[187219]: 2025-11-25 18:53:26.585 187223 DEBUG nova.virt.libvirt.driver [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 25 13:53:28 np0005535656 nova_compute[187219]: 2025-11-25 18:53:28.067 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:28 np0005535656 kernel: tap7fa3def6-9a (unregistering): left promiscuous mode
Nov 25 13:53:28 np0005535656 NetworkManager[55548]: <info>  [1764096808.9887] device (tap7fa3def6-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 13:53:28 np0005535656 podman[209018]: 2025-11-25 18:53:28.991051169 +0000 UTC m=+0.095247311 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 13:53:29 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:29Z|00032|binding|INFO|Releasing lport 7fa3def6-9a8f-401b-8172-8dff9d7542e6 from this chassis (sb_readonly=0)
Nov 25 13:53:29 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:29Z|00033|binding|INFO|Setting lport 7fa3def6-9a8f-401b-8172-8dff9d7542e6 down in Southbound
Nov 25 13:53:29 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:29Z|00034|binding|INFO|Removing iface tap7fa3def6-9a ovn-installed in OVS
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.001 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.014 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:5a:1b 10.100.0.4'], port_security=['fa:16:3e:ff:5a:1b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '36d95b58-84fc-4d29-9aa8-0d0a919c9c72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90f5f32749934e1bb4a31b5643dc964a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3235d006-85b4-4c07-966c-48d4df16258d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dde4be2a-475e-47e2-8532-faebae80eb26, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=7fa3def6-9a8f-401b-8172-8dff9d7542e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.016 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 7fa3def6-9a8f-401b-8172-8dff9d7542e6 in datapath fe81e455-495f-4aea-8dd6-8b6f8cf5d198 unbound from our chassis#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.018 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe81e455-495f-4aea-8dd6-8b6f8cf5d198, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.019 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c61fc775-81d5-41cd-9504-a3a7a9e47553]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.019 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 namespace which is not needed anymore#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.033 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:29 np0005535656 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 25 13:53:29 np0005535656 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 15.822s CPU time.
Nov 25 13:53:29 np0005535656 systemd-machined[153481]: Machine qemu-1-instance-00000001 terminated.
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.221 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:29 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[208872]: [NOTICE]   (208876) : haproxy version is 2.8.14-c23fe91
Nov 25 13:53:29 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[208872]: [NOTICE]   (208876) : path to executable is /usr/sbin/haproxy
Nov 25 13:53:29 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[208872]: [WARNING]  (208876) : Exiting Master process...
Nov 25 13:53:29 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[208872]: [WARNING]  (208876) : Exiting Master process...
Nov 25 13:53:29 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[208872]: [ALERT]    (208876) : Current worker (208878) exited with code 143 (Terminated)
Nov 25 13:53:29 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[208872]: [WARNING]  (208876) : All workers exited. Exiting... (0)
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.225 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:29 np0005535656 systemd[1]: libpod-8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6.scope: Deactivated successfully.
Nov 25 13:53:29 np0005535656 podman[209061]: 2025-11-25 18:53:29.23436006 +0000 UTC m=+0.067307924 container died 8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.239 187223 DEBUG nova.compute.manager [req-be9c27a1-ba2a-45fb-84d5-053617fe0ecd req-7c283d1d-2cf7-4313-87ed-c8b19c3e95b3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received event network-vif-unplugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.239 187223 DEBUG oslo_concurrency.lockutils [req-be9c27a1-ba2a-45fb-84d5-053617fe0ecd req-7c283d1d-2cf7-4313-87ed-c8b19c3e95b3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.239 187223 DEBUG oslo_concurrency.lockutils [req-be9c27a1-ba2a-45fb-84d5-053617fe0ecd req-7c283d1d-2cf7-4313-87ed-c8b19c3e95b3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.240 187223 DEBUG oslo_concurrency.lockutils [req-be9c27a1-ba2a-45fb-84d5-053617fe0ecd req-7c283d1d-2cf7-4313-87ed-c8b19c3e95b3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.240 187223 DEBUG nova.compute.manager [req-be9c27a1-ba2a-45fb-84d5-053617fe0ecd req-7c283d1d-2cf7-4313-87ed-c8b19c3e95b3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] No waiting events found dispatching network-vif-unplugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.240 187223 WARNING nova.compute.manager [req-be9c27a1-ba2a-45fb-84d5-053617fe0ecd req-7c283d1d-2cf7-4313-87ed-c8b19c3e95b3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received unexpected event network-vif-unplugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 25 13:53:29 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6-userdata-shm.mount: Deactivated successfully.
Nov 25 13:53:29 np0005535656 systemd[1]: var-lib-containers-storage-overlay-6cac13af52bb08dff21398b7fac9b4820f9e1bc4b73eb0e44046cc2ae73c5368-merged.mount: Deactivated successfully.
Nov 25 13:53:29 np0005535656 podman[209061]: 2025-11-25 18:53:29.294671713 +0000 UTC m=+0.127619607 container cleanup 8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:53:29 np0005535656 systemd[1]: libpod-conmon-8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6.scope: Deactivated successfully.
Nov 25 13:53:29 np0005535656 podman[209106]: 2025-11-25 18:53:29.402373471 +0000 UTC m=+0.072530006 container remove 8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.412 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[87c60f5d-6187-4de0-a80d-88a699afd292]: (4, ('Tue Nov 25 06:53:29 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 (8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6)\n8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6\nTue Nov 25 06:53:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 (8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6)\n8bd8733242dd3a7359edcab8f43f49bed2b980d10058d877983d23cf26a077a6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.414 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[bf97e3ed-5176-4f3a-ae94-fd39800f536d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.415 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe81e455-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.418 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:29 np0005535656 kernel: tapfe81e455-40: left promiscuous mode
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.446 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.451 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[3ae21d09-203f-4894-957e-897cc5080ad3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.470 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[141bc72e-6077-4237-a8b6-900c69402a6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.471 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5286fc92-afdf-4136-ae58-f54495d6fbaa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.495 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6498e094-c66b-4147-8a23-190d9ddcd21e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373575, 'reachable_time': 15445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209125, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:29 np0005535656 systemd[1]: run-netns-ovnmeta\x2dfe81e455\x2d495f\x2d4aea\x2d8dd6\x2d8b6f8cf5d198.mount: Deactivated successfully.
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.509 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 13:53:29 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:29.510 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[10040530-8cf6-4f27-a6b5-c291d7bbac35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.609 187223 INFO nova.virt.libvirt.driver [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Instance shutdown successfully after 3 seconds.#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.616 187223 INFO nova.virt.libvirt.driver [-] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Instance destroyed successfully.#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.617 187223 DEBUG nova.virt.libvirt.vif [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T18:52:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1103197619',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1103197619',id=1,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T18:52:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-m202b224',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T18:53:22Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=36d95b58-84fc-4d29-9aa8-0d0a919c9c72,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "vif_mac": "fa:16:3e:ff:5a:1b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.618 187223 DEBUG nova.network.os_vif_util [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "vif_mac": "fa:16:3e:ff:5a:1b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.620 187223 DEBUG nova.network.os_vif_util [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.620 187223 DEBUG os_vif [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.622 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.623 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fa3def6-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.624 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.626 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.630 187223 INFO os_vif [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a')#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.635 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.687 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.730 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.731 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.796 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.798 187223 DEBUG nova.virt.libvirt.volume.remotefs [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Copying file /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72_resize/disk to 192.168.122.100:/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 25 13:53:29 np0005535656 nova_compute[187219]: 2025-11-25 18:53:29.798 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72_resize/disk 192.168.122.100:/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:30 np0005535656 nova_compute[187219]: 2025-11-25 18:53:30.447 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "scp -r /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72_resize/disk 192.168.122.100:/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk" returned: 0 in 0.648s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:30 np0005535656 nova_compute[187219]: 2025-11-25 18:53:30.448 187223 DEBUG nova.virt.libvirt.volume.remotefs [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Copying file /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 25 13:53:30 np0005535656 nova_compute[187219]: 2025-11-25 18:53:30.448 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72_resize/disk.config 192.168.122.100:/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:30 np0005535656 nova_compute[187219]: 2025-11-25 18:53:30.700 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "scp -C -r /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72_resize/disk.config 192.168.122.100:/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.config" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:30 np0005535656 nova_compute[187219]: 2025-11-25 18:53:30.702 187223 DEBUG nova.virt.libvirt.volume.remotefs [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Copying file /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 25 13:53:30 np0005535656 nova_compute[187219]: 2025-11-25 18:53:30.702 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72_resize/disk.info 192.168.122.100:/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:30 np0005535656 nova_compute[187219]: 2025-11-25 18:53:30.962 187223 DEBUG oslo_concurrency.processutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "scp -C -r /var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72_resize/disk.info 192.168.122.100:/var/lib/nova/instances/36d95b58-84fc-4d29-9aa8-0d0a919c9c72/disk.info" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.184 187223 DEBUG neutronclient.v2_0.client [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 7fa3def6-9a8f-401b-8172-8dff9d7542e6 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.248 187223 DEBUG oslo_concurrency.lockutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.248 187223 DEBUG oslo_concurrency.lockutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.258 187223 INFO nova.compute.rpcapi [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.259 187223 DEBUG oslo_concurrency.lockutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.281 187223 DEBUG oslo_concurrency.lockutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.282 187223 DEBUG oslo_concurrency.lockutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.282 187223 DEBUG oslo_concurrency.lockutils [None req-fb919fcd-51ea-4b9b-816e-7091b5c721fe fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.329 187223 DEBUG nova.compute.manager [req-aad733d7-1450-45d8-8338-ebfbad6f97a7 req-b9f10793-ac71-48ee-842d-45c6071ac2fa 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.330 187223 DEBUG oslo_concurrency.lockutils [req-aad733d7-1450-45d8-8338-ebfbad6f97a7 req-b9f10793-ac71-48ee-842d-45c6071ac2fa 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.330 187223 DEBUG oslo_concurrency.lockutils [req-aad733d7-1450-45d8-8338-ebfbad6f97a7 req-b9f10793-ac71-48ee-842d-45c6071ac2fa 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.331 187223 DEBUG oslo_concurrency.lockutils [req-aad733d7-1450-45d8-8338-ebfbad6f97a7 req-b9f10793-ac71-48ee-842d-45c6071ac2fa 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.331 187223 DEBUG nova.compute.manager [req-aad733d7-1450-45d8-8338-ebfbad6f97a7 req-b9f10793-ac71-48ee-842d-45c6071ac2fa 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] No waiting events found dispatching network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:53:31 np0005535656 nova_compute[187219]: 2025-11-25 18:53:31.331 187223 WARNING nova.compute.manager [req-aad733d7-1450-45d8-8338-ebfbad6f97a7 req-b9f10793-ac71-48ee-842d-45c6071ac2fa 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received unexpected event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 25 13:53:33 np0005535656 nova_compute[187219]: 2025-11-25 18:53:33.208 187223 DEBUG nova.compute.manager [req-1028c5d3-8e5a-4744-9ae8-e750386aa062 req-d8eb6c39-78b7-49a3-901f-1192c536cec7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received event network-changed-7fa3def6-9a8f-401b-8172-8dff9d7542e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:53:33 np0005535656 nova_compute[187219]: 2025-11-25 18:53:33.209 187223 DEBUG nova.compute.manager [req-1028c5d3-8e5a-4744-9ae8-e750386aa062 req-d8eb6c39-78b7-49a3-901f-1192c536cec7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Refreshing instance network info cache due to event network-changed-7fa3def6-9a8f-401b-8172-8dff9d7542e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 13:53:33 np0005535656 nova_compute[187219]: 2025-11-25 18:53:33.210 187223 DEBUG oslo_concurrency.lockutils [req-1028c5d3-8e5a-4744-9ae8-e750386aa062 req-d8eb6c39-78b7-49a3-901f-1192c536cec7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:53:33 np0005535656 nova_compute[187219]: 2025-11-25 18:53:33.210 187223 DEBUG oslo_concurrency.lockutils [req-1028c5d3-8e5a-4744-9ae8-e750386aa062 req-d8eb6c39-78b7-49a3-901f-1192c536cec7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:53:33 np0005535656 nova_compute[187219]: 2025-11-25 18:53:33.210 187223 DEBUG nova.network.neutron [req-1028c5d3-8e5a-4744-9ae8-e750386aa062 req-d8eb6c39-78b7-49a3-901f-1192c536cec7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Refreshing network info cache for port 7fa3def6-9a8f-401b-8172-8dff9d7542e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 13:53:34 np0005535656 nova_compute[187219]: 2025-11-25 18:53:34.650 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:34 np0005535656 nova_compute[187219]: 2025-11-25 18:53:34.688 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:35 np0005535656 nova_compute[187219]: 2025-11-25 18:53:35.345 187223 DEBUG nova.network.neutron [req-1028c5d3-8e5a-4744-9ae8-e750386aa062 req-d8eb6c39-78b7-49a3-901f-1192c536cec7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Updated VIF entry in instance network info cache for port 7fa3def6-9a8f-401b-8172-8dff9d7542e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 13:53:35 np0005535656 nova_compute[187219]: 2025-11-25 18:53:35.347 187223 DEBUG nova.network.neutron [req-1028c5d3-8e5a-4744-9ae8-e750386aa062 req-d8eb6c39-78b7-49a3-901f-1192c536cec7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Updating instance_info_cache with network_info: [{"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:53:35 np0005535656 nova_compute[187219]: 2025-11-25 18:53:35.369 187223 DEBUG oslo_concurrency.lockutils [req-1028c5d3-8e5a-4744-9ae8-e750386aa062 req-d8eb6c39-78b7-49a3-901f-1192c536cec7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:53:35 np0005535656 podman[197580]: time="2025-11-25T18:53:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:53:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:53:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:53:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:53:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2585 "" "Go-http-client/1.1"
Nov 25 13:53:36 np0005535656 nova_compute[187219]: 2025-11-25 18:53:36.610 187223 DEBUG nova.compute.manager [req-a20ce535-8093-4722-8925-e46a8b7c4263 req-58d01b5b-ad8f-42b6-8f96-fa1c22be5b46 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:53:36 np0005535656 nova_compute[187219]: 2025-11-25 18:53:36.611 187223 DEBUG oslo_concurrency.lockutils [req-a20ce535-8093-4722-8925-e46a8b7c4263 req-58d01b5b-ad8f-42b6-8f96-fa1c22be5b46 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:36 np0005535656 nova_compute[187219]: 2025-11-25 18:53:36.612 187223 DEBUG oslo_concurrency.lockutils [req-a20ce535-8093-4722-8925-e46a8b7c4263 req-58d01b5b-ad8f-42b6-8f96-fa1c22be5b46 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:36 np0005535656 nova_compute[187219]: 2025-11-25 18:53:36.612 187223 DEBUG oslo_concurrency.lockutils [req-a20ce535-8093-4722-8925-e46a8b7c4263 req-58d01b5b-ad8f-42b6-8f96-fa1c22be5b46 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:36 np0005535656 nova_compute[187219]: 2025-11-25 18:53:36.613 187223 DEBUG nova.compute.manager [req-a20ce535-8093-4722-8925-e46a8b7c4263 req-58d01b5b-ad8f-42b6-8f96-fa1c22be5b46 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] No waiting events found dispatching network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:53:36 np0005535656 nova_compute[187219]: 2025-11-25 18:53:36.613 187223 WARNING nova.compute.manager [req-a20ce535-8093-4722-8925-e46a8b7c4263 req-58d01b5b-ad8f-42b6-8f96-fa1c22be5b46 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received unexpected event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 for instance with vm_state active and task_state resize_finish.#033[00m
Nov 25 13:53:38 np0005535656 nova_compute[187219]: 2025-11-25 18:53:38.755 187223 DEBUG nova.compute.manager [req-47f9cf39-a1b8-4a1b-8f3a-27809d33a278 req-dc8aa625-753f-4d9f-bfc8-7e8a2024e5e8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:53:38 np0005535656 nova_compute[187219]: 2025-11-25 18:53:38.756 187223 DEBUG oslo_concurrency.lockutils [req-47f9cf39-a1b8-4a1b-8f3a-27809d33a278 req-dc8aa625-753f-4d9f-bfc8-7e8a2024e5e8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:38 np0005535656 nova_compute[187219]: 2025-11-25 18:53:38.757 187223 DEBUG oslo_concurrency.lockutils [req-47f9cf39-a1b8-4a1b-8f3a-27809d33a278 req-dc8aa625-753f-4d9f-bfc8-7e8a2024e5e8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:38 np0005535656 nova_compute[187219]: 2025-11-25 18:53:38.757 187223 DEBUG oslo_concurrency.lockutils [req-47f9cf39-a1b8-4a1b-8f3a-27809d33a278 req-dc8aa625-753f-4d9f-bfc8-7e8a2024e5e8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:38 np0005535656 nova_compute[187219]: 2025-11-25 18:53:38.758 187223 DEBUG nova.compute.manager [req-47f9cf39-a1b8-4a1b-8f3a-27809d33a278 req-dc8aa625-753f-4d9f-bfc8-7e8a2024e5e8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] No waiting events found dispatching network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:53:38 np0005535656 nova_compute[187219]: 2025-11-25 18:53:38.758 187223 WARNING nova.compute.manager [req-47f9cf39-a1b8-4a1b-8f3a-27809d33a278 req-dc8aa625-753f-4d9f-bfc8-7e8a2024e5e8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Received unexpected event network-vif-plugged-7fa3def6-9a8f-401b-8172-8dff9d7542e6 for instance with vm_state resized and task_state None.#033[00m
Nov 25 13:53:39 np0005535656 nova_compute[187219]: 2025-11-25 18:53:39.652 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:39 np0005535656 nova_compute[187219]: 2025-11-25 18:53:39.690 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:40 np0005535656 podman[209141]: 2025-11-25 18:53:40.992119206 +0000 UTC m=+0.093426802 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:53:41 np0005535656 nova_compute[187219]: 2025-11-25 18:53:41.510 187223 DEBUG oslo_concurrency.lockutils [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:41 np0005535656 nova_compute[187219]: 2025-11-25 18:53:41.511 187223 DEBUG oslo_concurrency.lockutils [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:41 np0005535656 nova_compute[187219]: 2025-11-25 18:53:41.511 187223 DEBUG nova.compute.manager [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 25 13:53:42 np0005535656 nova_compute[187219]: 2025-11-25 18:53:42.231 187223 DEBUG neutronclient.v2_0.client [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 7fa3def6-9a8f-401b-8172-8dff9d7542e6 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 25 13:53:42 np0005535656 nova_compute[187219]: 2025-11-25 18:53:42.232 187223 DEBUG oslo_concurrency.lockutils [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:53:42 np0005535656 nova_compute[187219]: 2025-11-25 18:53:42.233 187223 DEBUG oslo_concurrency.lockutils [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:53:42 np0005535656 nova_compute[187219]: 2025-11-25 18:53:42.233 187223 DEBUG nova.network.neutron [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 13:53:42 np0005535656 nova_compute[187219]: 2025-11-25 18:53:42.233 187223 DEBUG nova.objects.instance [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'info_cache' on Instance uuid 36d95b58-84fc-4d29-9aa8-0d0a919c9c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:53:44 np0005535656 nova_compute[187219]: 2025-11-25 18:53:44.275 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764096809.274417, 36d95b58-84fc-4d29-9aa8-0d0a919c9c72 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:53:44 np0005535656 nova_compute[187219]: 2025-11-25 18:53:44.276 187223 INFO nova.compute.manager [-] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] VM Stopped (Lifecycle Event)#033[00m
Nov 25 13:53:44 np0005535656 nova_compute[187219]: 2025-11-25 18:53:44.401 187223 DEBUG nova.compute.manager [None req-99f81a3e-ff1d-44c6-b1cf-cd0f02695e3d - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:53:44 np0005535656 nova_compute[187219]: 2025-11-25 18:53:44.406 187223 DEBUG nova.compute.manager [None req-99f81a3e-ff1d-44c6-b1cf-cd0f02695e3d - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:53:44 np0005535656 nova_compute[187219]: 2025-11-25 18:53:44.445 187223 INFO nova.compute.manager [None req-99f81a3e-ff1d-44c6-b1cf-cd0f02695e3d - - - - - -] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 25 13:53:44 np0005535656 nova_compute[187219]: 2025-11-25 18:53:44.654 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:44 np0005535656 nova_compute[187219]: 2025-11-25 18:53:44.693 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.202 187223 DEBUG nova.network.neutron [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 36d95b58-84fc-4d29-9aa8-0d0a919c9c72] Updating instance_info_cache with network_info: [{"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.261 187223 DEBUG oslo_concurrency.lockutils [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-36d95b58-84fc-4d29-9aa8-0d0a919c9c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.262 187223 DEBUG nova.objects.instance [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid 36d95b58-84fc-4d29-9aa8-0d0a919c9c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.298 187223 DEBUG nova.virt.libvirt.host [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.298 187223 INFO nova.virt.libvirt.host [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] UEFI support detected#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.300 187223 DEBUG nova.virt.libvirt.vif [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T18:52:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1103197619',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1103197619',id=1,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T18:53:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-m202b224',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_
ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T18:53:37Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=36d95b58-84fc-4d29-9aa8-0d0a919c9c72,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.301 187223 DEBUG nova.network.os_vif_util [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "address": "fa:16:3e:ff:5a:1b", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3def6-9a", "ovs_interfaceid": "7fa3def6-9a8f-401b-8172-8dff9d7542e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.302 187223 DEBUG nova.network.os_vif_util [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.302 187223 DEBUG os_vif [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.304 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.304 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fa3def6-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.304 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.307 187223 INFO os_vif [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:5a:1b,bridge_name='br-int',has_traffic_filtering=True,id=7fa3def6-9a8f-401b-8172-8dff9d7542e6,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3def6-9a')#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.307 187223 DEBUG oslo_concurrency.lockutils [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.307 187223 DEBUG oslo_concurrency.lockutils [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.405 187223 DEBUG nova.compute.provider_tree [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.425 187223 DEBUG nova.scheduler.client.report [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.486 187223 DEBUG oslo_concurrency.lockutils [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.727 187223 INFO nova.scheduler.client.report [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Deleted allocation for migration 656517aa-0166-46b0-946d-1ac93299fd38#033[00m
Nov 25 13:53:46 np0005535656 nova_compute[187219]: 2025-11-25 18:53:46.814 187223 DEBUG oslo_concurrency.lockutils [None req-665b1013-ea44-48cd-ab6d-48393b429dd9 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "36d95b58-84fc-4d29-9aa8-0d0a919c9c72" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:47 np0005535656 nova_compute[187219]: 2025-11-25 18:53:47.679 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.572 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.573 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.604 187223 DEBUG nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.691 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.691 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.707 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.708 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.721 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.721 187223 INFO nova.compute.claims [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.901 187223 DEBUG nova.compute.provider_tree [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:53:48 np0005535656 nova_compute[187219]: 2025-11-25 18:53:48.922 187223 DEBUG nova.scheduler.client.report [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:53:48 np0005535656 podman[209166]: 2025-11-25 18:53:48.939629892 +0000 UTC m=+0.056468211 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 25 13:53:49 np0005535656 podman[209165]: 2025-11-25 18:53:49.019406092 +0000 UTC m=+0.139218671 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.082 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.083 187223 DEBUG nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.222 187223 DEBUG nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.222 187223 DEBUG nova.network.neutron [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.245 187223 INFO nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.376 187223 DEBUG nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 13:53:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:53:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:53:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:53:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:53:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:53:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.484 187223 DEBUG nova.policy [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be3c7719092245a3b39ec72ada0c5247', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90f5f32749934e1bb4a31b5643dc964a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.656 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:53:49 np0005535656 nova_compute[187219]: 2025-11-25 18:53:49.694 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:50 np0005535656 nova_compute[187219]: 2025-11-25 18:53:50.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:53:51 np0005535656 nova_compute[187219]: 2025-11-25 18:53:51.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:53:52 np0005535656 nova_compute[187219]: 2025-11-25 18:53:52.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.056 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.056 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.058 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.058 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.122 187223 DEBUG nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.126 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.127 187223 INFO nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Creating image(s)#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.128 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "/var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.129 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "/var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.130 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "/var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.152 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.229 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.231 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.232 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.243 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.301 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.302 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.339 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.340 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5916MB free_disk=73.16780090332031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.340 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.341 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.350 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.351 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.351 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.409 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.410 187223 DEBUG nova.virt.disk.api [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Checking if we can resize image /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.411 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.462 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 135f8d09-972f-4564-a9cf-74128ae9320a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.463 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.463 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.469 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.470 187223 DEBUG nova.virt.disk.api [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Cannot resize image /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.471 187223 DEBUG nova.objects.instance [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'migration_context' on Instance uuid 135f8d09-972f-4564-a9cf-74128ae9320a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.489 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.490 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Ensure instance console log exists: /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.491 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.491 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.492 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.521 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.544 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.578 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.579 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:53 np0005535656 nova_compute[187219]: 2025-11-25 18:53:53.999 187223 DEBUG nova.network.neutron [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Successfully created port: 906ded83-fa3f-44e8-a187-2d7233b49cba _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 13:53:54 np0005535656 nova_compute[187219]: 2025-11-25 18:53:54.659 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:54 np0005535656 nova_compute[187219]: 2025-11-25 18:53:54.697 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:54 np0005535656 nova_compute[187219]: 2025-11-25 18:53:54.754 187223 DEBUG nova.network.neutron [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Successfully updated port: 906ded83-fa3f-44e8-a187-2d7233b49cba _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 13:53:54 np0005535656 nova_compute[187219]: 2025-11-25 18:53:54.792 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:53:54 np0005535656 nova_compute[187219]: 2025-11-25 18:53:54.793 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquired lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:53:54 np0005535656 nova_compute[187219]: 2025-11-25 18:53:54.793 187223 DEBUG nova.network.neutron [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 13:53:54 np0005535656 nova_compute[187219]: 2025-11-25 18:53:54.909 187223 DEBUG nova.compute.manager [req-e67849a9-76d1-4e4f-a901-30bc8e4cf1d4 req-0bdfb71f-0c81-4675-89a2-d7d30b28f2b0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-changed-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:53:54 np0005535656 nova_compute[187219]: 2025-11-25 18:53:54.910 187223 DEBUG nova.compute.manager [req-e67849a9-76d1-4e4f-a901-30bc8e4cf1d4 req-0bdfb71f-0c81-4675-89a2-d7d30b28f2b0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Refreshing instance network info cache due to event network-changed-906ded83-fa3f-44e8-a187-2d7233b49cba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 13:53:54 np0005535656 nova_compute[187219]: 2025-11-25 18:53:54.911 187223 DEBUG oslo_concurrency.lockutils [req-e67849a9-76d1-4e4f-a901-30bc8e4cf1d4 req-0bdfb71f-0c81-4675-89a2-d7d30b28f2b0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:53:55 np0005535656 nova_compute[187219]: 2025-11-25 18:53:55.006 187223 DEBUG nova.network.neutron [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 13:53:55 np0005535656 nova_compute[187219]: 2025-11-25 18:53:55.579 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:53:55 np0005535656 podman[209226]: 2025-11-25 18:53:55.976792958 +0000 UTC m=+0.086905675 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.196 187223 DEBUG nova.network.neutron [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Updating instance_info_cache with network_info: [{"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.241 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Releasing lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.242 187223 DEBUG nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Instance network_info: |[{"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.242 187223 DEBUG oslo_concurrency.lockutils [req-e67849a9-76d1-4e4f-a901-30bc8e4cf1d4 req-0bdfb71f-0c81-4675-89a2-d7d30b28f2b0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.243 187223 DEBUG nova.network.neutron [req-e67849a9-76d1-4e4f-a901-30bc8e4cf1d4 req-0bdfb71f-0c81-4675-89a2-d7d30b28f2b0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Refreshing network info cache for port 906ded83-fa3f-44e8-a187-2d7233b49cba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.248 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Start _get_guest_xml network_info=[{"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.253 187223 WARNING nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.259 187223 DEBUG nova.virt.libvirt.host [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.260 187223 DEBUG nova.virt.libvirt.host [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.263 187223 DEBUG nova.virt.libvirt.host [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.264 187223 DEBUG nova.virt.libvirt.host [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.266 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.266 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.267 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.268 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.268 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.268 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.269 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.269 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.270 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.270 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.271 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.271 187223 DEBUG nova.virt.hardware [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.277 187223 DEBUG nova.virt.libvirt.vif [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:53:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2108820933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2108820933',id=3,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-t4hk0dbk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:53:51Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=135f8d09-972f-4564-a9cf-74128ae9320a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.278 187223 DEBUG nova.network.os_vif_util [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.279 187223 DEBUG nova.network.os_vif_util [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:6f:c4,bridge_name='br-int',has_traffic_filtering=True,id=906ded83-fa3f-44e8-a187-2d7233b49cba,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap906ded83-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.281 187223 DEBUG nova.objects.instance [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'pci_devices' on Instance uuid 135f8d09-972f-4564-a9cf-74128ae9320a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.316 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] End _get_guest_xml xml=<domain type="kvm">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <uuid>135f8d09-972f-4564-a9cf-74128ae9320a</uuid>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <name>instance-00000003</name>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-2108820933</nova:name>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 18:53:57</nova:creationTime>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:        <nova:user uuid="be3c7719092245a3b39ec72ada0c5247">tempest-TestExecuteActionsViaActuator-53937300-project-member</nova:user>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:        <nova:project uuid="90f5f32749934e1bb4a31b5643dc964a">tempest-TestExecuteActionsViaActuator-53937300</nova:project>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:        <nova:port uuid="906ded83-fa3f-44e8-a187-2d7233b49cba">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <system>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <entry name="serial">135f8d09-972f-4564-a9cf-74128ae9320a</entry>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <entry name="uuid">135f8d09-972f-4564-a9cf-74128ae9320a</entry>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    </system>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <os>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  </os>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <features>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  </features>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  </clock>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  <devices>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk.config"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:10:6f:c4"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <target dev="tap906ded83-fa"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    </interface>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/console.log" append="off"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    </serial>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <video>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    </video>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    </rng>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 13:53:57 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 13:53:57 np0005535656 nova_compute[187219]:  </devices>
Nov 25 13:53:57 np0005535656 nova_compute[187219]: </domain>
Nov 25 13:53:57 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.317 187223 DEBUG nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Preparing to wait for external event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.318 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.318 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.318 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.320 187223 DEBUG nova.virt.libvirt.vif [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:53:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2108820933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2108820933',id=3,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-t4hk0dbk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:53:51Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=135f8d09-972f-4564-a9cf-74128ae9320a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.320 187223 DEBUG nova.network.os_vif_util [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.321 187223 DEBUG nova.network.os_vif_util [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:6f:c4,bridge_name='br-int',has_traffic_filtering=True,id=906ded83-fa3f-44e8-a187-2d7233b49cba,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap906ded83-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.322 187223 DEBUG os_vif [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:6f:c4,bridge_name='br-int',has_traffic_filtering=True,id=906ded83-fa3f-44e8-a187-2d7233b49cba,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap906ded83-fa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.323 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.323 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.324 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.328 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.328 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap906ded83-fa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.329 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap906ded83-fa, col_values=(('external_ids', {'iface-id': '906ded83-fa3f-44e8-a187-2d7233b49cba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:6f:c4', 'vm-uuid': '135f8d09-972f-4564-a9cf-74128ae9320a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.331 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:57 np0005535656 NetworkManager[55548]: <info>  [1764096837.3326] manager: (tap906ded83-fa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.336 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.338 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.340 187223 INFO os_vif [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:6f:c4,bridge_name='br-int',has_traffic_filtering=True,id=906ded83-fa3f-44e8-a187-2d7233b49cba,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap906ded83-fa')#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.443 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.443 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.444 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No VIF found with MAC fa:16:3e:10:6f:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.445 187223 INFO nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Using config drive#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.987 187223 INFO nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Creating config drive at /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk.config#033[00m
Nov 25 13:53:57 np0005535656 nova_compute[187219]: 2025-11-25 18:53:57.997 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphtbewkt3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:53:58 np0005535656 nova_compute[187219]: 2025-11-25 18:53:58.126 187223 DEBUG oslo_concurrency.processutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphtbewkt3" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:53:58 np0005535656 kernel: tap906ded83-fa: entered promiscuous mode
Nov 25 13:53:58 np0005535656 NetworkManager[55548]: <info>  [1764096838.1951] manager: (tap906ded83-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Nov 25 13:53:58 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:58Z|00035|binding|INFO|Claiming lport 906ded83-fa3f-44e8-a187-2d7233b49cba for this chassis.
Nov 25 13:53:58 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:58Z|00036|binding|INFO|906ded83-fa3f-44e8-a187-2d7233b49cba: Claiming fa:16:3e:10:6f:c4 10.100.0.10
Nov 25 13:53:58 np0005535656 nova_compute[187219]: 2025-11-25 18:53:58.195 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:58 np0005535656 nova_compute[187219]: 2025-11-25 18:53:58.209 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:58 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:58Z|00037|binding|INFO|Setting lport 906ded83-fa3f-44e8-a187-2d7233b49cba ovn-installed in OVS
Nov 25 13:53:58 np0005535656 nova_compute[187219]: 2025-11-25 18:53:58.213 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:58 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:58Z|00038|binding|INFO|Setting lport 906ded83-fa3f-44e8-a187-2d7233b49cba up in Southbound
Nov 25 13:53:58 np0005535656 nova_compute[187219]: 2025-11-25 18:53:58.216 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.216 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:6f:c4 10.100.0.10'], port_security=['fa:16:3e:10:6f:c4 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '135f8d09-972f-4564-a9cf-74128ae9320a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90f5f32749934e1bb4a31b5643dc964a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3235d006-85b4-4c07-966c-48d4df16258d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dde4be2a-475e-47e2-8532-faebae80eb26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=906ded83-fa3f-44e8-a187-2d7233b49cba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.218 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 906ded83-fa3f-44e8-a187-2d7233b49cba in datapath fe81e455-495f-4aea-8dd6-8b6f8cf5d198 bound to our chassis#033[00m
Nov 25 13:53:58 np0005535656 systemd-udevd[209264]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.220 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe81e455-495f-4aea-8dd6-8b6f8cf5d198#033[00m
Nov 25 13:53:58 np0005535656 systemd-machined[153481]: New machine qemu-2-instance-00000003.
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.231 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[28df8225-894c-4bc7-9bdc-070d9b9def0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.232 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe81e455-41 in ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.234 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe81e455-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.234 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7e5141-d1f9-4feb-bef6-ff5cd5c98717]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.235 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[3b777c1d-4349-4971-9fd0-122d49b27c4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 NetworkManager[55548]: <info>  [1764096838.2377] device (tap906ded83-fa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 13:53:58 np0005535656 NetworkManager[55548]: <info>  [1764096838.2387] device (tap906ded83-fa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 13:53:58 np0005535656 systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.247 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[ade14c8e-cfbe-4f4b-9587-66a821acfb16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.271 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1918c308-8060-4484-959e-e233301b6a24]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.297 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[945c632d-cd8b-486e-833e-b2bc87789f2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.302 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2736ae65-5ef5-4bd6-8b29-7c6b5ac8466b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 NetworkManager[55548]: <info>  [1764096838.3038] manager: (tapfe81e455-40): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.332 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[bad46e65-f509-4b8f-8d9c-7da744186978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.336 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8591bf-5c63-40a8-a970-3f03b531fd50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 NetworkManager[55548]: <info>  [1764096838.3573] device (tapfe81e455-40): carrier: link connected
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.360 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[60697e49-7434-40a5-9654-872cc83d1e54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.375 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[db21af48-0a30-48cd-9f7d-77c0036e36b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe81e455-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:a2:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379988, 'reachable_time': 34946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209299, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.386 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[13084da4-2f61-4f51-9632-4730451d7f7e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:a250'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379988, 'tstamp': 379988}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209300, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.400 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[cb952ab7-7792-4e15-aeeb-fa6fbe30ecc1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe81e455-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:a2:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379988, 'reachable_time': 34946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209301, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.426 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[7eddc88a-1098-4a7f-88d3-18399ea6022a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.482 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[d457d619-0652-4ad7-af8a-1704cfc2fa9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.484 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe81e455-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.484 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.485 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe81e455-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:58 np0005535656 NetworkManager[55548]: <info>  [1764096838.4882] manager: (tapfe81e455-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 25 13:53:58 np0005535656 nova_compute[187219]: 2025-11-25 18:53:58.487 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:58 np0005535656 kernel: tapfe81e455-40: entered promiscuous mode
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.493 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe81e455-40, col_values=(('external_ids', {'iface-id': '035fc4d6-bdf9-4495-a5a8-2c835f3dfc48'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:53:58 np0005535656 nova_compute[187219]: 2025-11-25 18:53:58.492 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:58 np0005535656 ovn_controller[95460]: 2025-11-25T18:53:58Z|00039|binding|INFO|Releasing lport 035fc4d6-bdf9-4495-a5a8-2c835f3dfc48 from this chassis (sb_readonly=0)
Nov 25 13:53:58 np0005535656 nova_compute[187219]: 2025-11-25 18:53:58.496 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.509 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe81e455-495f-4aea-8dd6-8b6f8cf5d198.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe81e455-495f-4aea-8dd6-8b6f8cf5d198.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 13:53:58 np0005535656 nova_compute[187219]: 2025-11-25 18:53:58.508 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.510 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[05bc3e9d-7b4e-41cc-912b-1abeae01fcba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.511 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-fe81e455-495f-4aea-8dd6-8b6f8cf5d198
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/fe81e455-495f-4aea-8dd6-8b6f8cf5d198.pid.haproxy
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID fe81e455-495f-4aea-8dd6-8b6f8cf5d198
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 13:53:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:58.512 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'env', 'PROCESS_TAG=haproxy-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe81e455-495f-4aea-8dd6-8b6f8cf5d198.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 13:53:58 np0005535656 podman[209333]: 2025-11-25 18:53:58.875536196 +0000 UTC m=+0.030483576 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 13:53:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:59.068 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:53:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:59.069 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:53:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:53:59.070 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:53:59 np0005535656 podman[209333]: 2025-11-25 18:53:59.190169399 +0000 UTC m=+0.345116719 container create b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:53:59 np0005535656 systemd[1]: Started libpod-conmon-b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2.scope.
Nov 25 13:53:59 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:53:59 np0005535656 podman[209346]: 2025-11-25 18:53:59.316882082 +0000 UTC m=+0.086277399 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 13:53:59 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7dfd84f56621b9dd0dbf7e3cdfffa00a959b180e039a39f51faffae8371e203/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 13:53:59 np0005535656 podman[209333]: 2025-11-25 18:53:59.340152501 +0000 UTC m=+0.495099811 container init b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 25 13:53:59 np0005535656 nova_compute[187219]: 2025-11-25 18:53:59.340 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096839.340213, 135f8d09-972f-4564-a9cf-74128ae9320a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:53:59 np0005535656 nova_compute[187219]: 2025-11-25 18:53:59.341 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] VM Started (Lifecycle Event)#033[00m
Nov 25 13:53:59 np0005535656 podman[209333]: 2025-11-25 18:53:59.348642311 +0000 UTC m=+0.503589601 container start b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 13:53:59 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[209369]: [NOTICE]   (209377) : New worker (209380) forked
Nov 25 13:53:59 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[209369]: [NOTICE]   (209377) : Loading success.
Nov 25 13:53:59 np0005535656 nova_compute[187219]: 2025-11-25 18:53:59.389 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:53:59 np0005535656 nova_compute[187219]: 2025-11-25 18:53:59.393 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096839.3413274, 135f8d09-972f-4564-a9cf-74128ae9320a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:53:59 np0005535656 nova_compute[187219]: 2025-11-25 18:53:59.394 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] VM Paused (Lifecycle Event)#033[00m
Nov 25 13:53:59 np0005535656 nova_compute[187219]: 2025-11-25 18:53:59.422 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:53:59 np0005535656 nova_compute[187219]: 2025-11-25 18:53:59.427 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:53:59 np0005535656 nova_compute[187219]: 2025-11-25 18:53:59.453 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 13:53:59 np0005535656 nova_compute[187219]: 2025-11-25 18:53:59.699 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.256 187223 DEBUG nova.compute.manager [req-402ca0db-c940-4631-9e03-6c469713b9d6 req-2ca885e2-142b-4dd8-886e-3ddc2eb56455 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.256 187223 DEBUG oslo_concurrency.lockutils [req-402ca0db-c940-4631-9e03-6c469713b9d6 req-2ca885e2-142b-4dd8-886e-3ddc2eb56455 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.256 187223 DEBUG oslo_concurrency.lockutils [req-402ca0db-c940-4631-9e03-6c469713b9d6 req-2ca885e2-142b-4dd8-886e-3ddc2eb56455 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.257 187223 DEBUG oslo_concurrency.lockutils [req-402ca0db-c940-4631-9e03-6c469713b9d6 req-2ca885e2-142b-4dd8-886e-3ddc2eb56455 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.257 187223 DEBUG nova.compute.manager [req-402ca0db-c940-4631-9e03-6c469713b9d6 req-2ca885e2-142b-4dd8-886e-3ddc2eb56455 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Processing event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.258 187223 DEBUG nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.262 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096840.2621477, 135f8d09-972f-4564-a9cf-74128ae9320a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.262 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] VM Resumed (Lifecycle Event)#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.264 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.267 187223 INFO nova.virt.libvirt.driver [-] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Instance spawned successfully.#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.269 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.342 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.348 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.349 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.350 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.351 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.351 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.352 187223 DEBUG nova.virt.libvirt.driver [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.359 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.427 187223 DEBUG nova.network.neutron [req-e67849a9-76d1-4e4f-a901-30bc8e4cf1d4 req-0bdfb71f-0c81-4675-89a2-d7d30b28f2b0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Updated VIF entry in instance network info cache for port 906ded83-fa3f-44e8-a187-2d7233b49cba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.428 187223 DEBUG nova.network.neutron [req-e67849a9-76d1-4e4f-a901-30bc8e4cf1d4 req-0bdfb71f-0c81-4675-89a2-d7d30b28f2b0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Updating instance_info_cache with network_info: [{"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.479 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.611 187223 DEBUG oslo_concurrency.lockutils [req-e67849a9-76d1-4e4f-a901-30bc8e4cf1d4 req-0bdfb71f-0c81-4675-89a2-d7d30b28f2b0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.621 187223 INFO nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Took 7.50 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.622 187223 DEBUG nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:54:00 np0005535656 nova_compute[187219]: 2025-11-25 18:54:00.692 187223 INFO nova.compute.manager [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Took 12.01 seconds to build instance.#033[00m
Nov 25 13:54:01 np0005535656 nova_compute[187219]: 2025-11-25 18:54:01.061 187223 DEBUG oslo_concurrency.lockutils [None req-a15cedbb-d7f9-4e47-b413-d09d4f8a5618 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:02 np0005535656 nova_compute[187219]: 2025-11-25 18:54:02.332 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:02 np0005535656 nova_compute[187219]: 2025-11-25 18:54:02.595 187223 DEBUG nova.compute.manager [req-ce6c6e84-1ada-4381-a3ef-b96e64efb7bb req-7a0d9212-128d-4953-bff7-91a06c2e47bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:54:02 np0005535656 nova_compute[187219]: 2025-11-25 18:54:02.596 187223 DEBUG oslo_concurrency.lockutils [req-ce6c6e84-1ada-4381-a3ef-b96e64efb7bb req-7a0d9212-128d-4953-bff7-91a06c2e47bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:02 np0005535656 nova_compute[187219]: 2025-11-25 18:54:02.596 187223 DEBUG oslo_concurrency.lockutils [req-ce6c6e84-1ada-4381-a3ef-b96e64efb7bb req-7a0d9212-128d-4953-bff7-91a06c2e47bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:02 np0005535656 nova_compute[187219]: 2025-11-25 18:54:02.596 187223 DEBUG oslo_concurrency.lockutils [req-ce6c6e84-1ada-4381-a3ef-b96e64efb7bb req-7a0d9212-128d-4953-bff7-91a06c2e47bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:02 np0005535656 nova_compute[187219]: 2025-11-25 18:54:02.597 187223 DEBUG nova.compute.manager [req-ce6c6e84-1ada-4381-a3ef-b96e64efb7bb req-7a0d9212-128d-4953-bff7-91a06c2e47bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] No waiting events found dispatching network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:54:02 np0005535656 nova_compute[187219]: 2025-11-25 18:54:02.597 187223 WARNING nova.compute.manager [req-ce6c6e84-1ada-4381-a3ef-b96e64efb7bb req-7a0d9212-128d-4953-bff7-91a06c2e47bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received unexpected event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba for instance with vm_state active and task_state None.#033[00m
Nov 25 13:54:04 np0005535656 nova_compute[187219]: 2025-11-25 18:54:04.730 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:05 np0005535656 podman[197580]: time="2025-11-25T18:54:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:54:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:54:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:54:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:54:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Nov 25 13:54:07 np0005535656 nova_compute[187219]: 2025-11-25 18:54:07.338 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:09 np0005535656 nova_compute[187219]: 2025-11-25 18:54:09.732 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:11 np0005535656 podman[209390]: 2025-11-25 18:54:11.969495896 +0000 UTC m=+0.089946687 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:54:12 np0005535656 nova_compute[187219]: 2025-11-25 18:54:12.340 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:14 np0005535656 nova_compute[187219]: 2025-11-25 18:54:14.776 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:17 np0005535656 nova_compute[187219]: 2025-11-25 18:54:17.342 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:17 np0005535656 ovn_controller[95460]: 2025-11-25T18:54:17Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:10:6f:c4 10.100.0.10
Nov 25 13:54:17 np0005535656 ovn_controller[95460]: 2025-11-25T18:54:17Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:10:6f:c4 10.100.0.10
Nov 25 13:54:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:54:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:54:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:54:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:54:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:54:19 np0005535656 nova_compute[187219]: 2025-11-25 18:54:19.821 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:19.842 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:54:19 np0005535656 nova_compute[187219]: 2025-11-25 18:54:19.844 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:19.845 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 13:54:19 np0005535656 podman[209439]: 2025-11-25 18:54:19.949088052 +0000 UTC m=+0.067335124 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 13:54:20 np0005535656 podman[209438]: 2025-11-25 18:54:20.054481408 +0000 UTC m=+0.161216209 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 25 13:54:22 np0005535656 nova_compute[187219]: 2025-11-25 18:54:22.345 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:24 np0005535656 nova_compute[187219]: 2025-11-25 18:54:24.824 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:25.849 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:54:27 np0005535656 podman[209482]: 2025-11-25 18:54:27.005654325 +0000 UTC m=+0.118889321 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 13:54:27 np0005535656 nova_compute[187219]: 2025-11-25 18:54:27.347 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:29 np0005535656 nova_compute[187219]: 2025-11-25 18:54:29.827 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:29 np0005535656 podman[209504]: 2025-11-25 18:54:29.949758903 +0000 UTC m=+0.075804994 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 25 13:54:32 np0005535656 nova_compute[187219]: 2025-11-25 18:54:32.348 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:34 np0005535656 nova_compute[187219]: 2025-11-25 18:54:34.830 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:35 np0005535656 podman[197580]: time="2025-11-25T18:54:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:54:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:54:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:54:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:54:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Nov 25 13:54:37 np0005535656 nova_compute[187219]: 2025-11-25 18:54:37.351 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:39 np0005535656 nova_compute[187219]: 2025-11-25 18:54:39.840 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:42 np0005535656 nova_compute[187219]: 2025-11-25 18:54:42.354 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:42 np0005535656 nova_compute[187219]: 2025-11-25 18:54:42.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:42 np0005535656 nova_compute[187219]: 2025-11-25 18:54:42.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 25 13:54:42 np0005535656 podman[209526]: 2025-11-25 18:54:42.962903358 +0000 UTC m=+0.072979425 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.339 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "af1b51b4-9c51-443a-932e-a48750d61085" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.339 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.368 187223 DEBUG nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.498 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.499 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.514 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.515 187223 INFO nova.compute.claims [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.716 187223 DEBUG nova.compute.provider_tree [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.740 187223 DEBUG nova.scheduler.client.report [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.772 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.773 187223 DEBUG nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.825 187223 DEBUG nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.826 187223 DEBUG nova.network.neutron [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.848 187223 INFO nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 13:54:43 np0005535656 nova_compute[187219]: 2025-11-25 18:54:43.871 187223 DEBUG nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.066 187223 DEBUG nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.067 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.068 187223 INFO nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Creating image(s)#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.068 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "/var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.069 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "/var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.070 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "/var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.087 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.111 187223 DEBUG nova.policy [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be3c7719092245a3b39ec72ada0c5247', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90f5f32749934e1bb4a31b5643dc964a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.165 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.167 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.167 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.188 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.257 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.259 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.301 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.302 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.302 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.357 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.358 187223 DEBUG nova.virt.disk.api [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Checking if we can resize image /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.359 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.415 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.416 187223 DEBUG nova.virt.disk.api [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Cannot resize image /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.417 187223 DEBUG nova.objects.instance [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'migration_context' on Instance uuid af1b51b4-9c51-443a-932e-a48750d61085 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.442 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.443 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Ensure instance console log exists: /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.443 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.443 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.444 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:44 np0005535656 nova_compute[187219]: 2025-11-25 18:54:44.842 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:46 np0005535656 nova_compute[187219]: 2025-11-25 18:54:46.053 187223 DEBUG nova.network.neutron [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Successfully created port: 9274e936-3662-499c-89b5-4b605917aad2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 13:54:46 np0005535656 nova_compute[187219]: 2025-11-25 18:54:46.691 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.336 187223 DEBUG nova.network.neutron [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Successfully updated port: 9274e936-3662-499c-89b5-4b605917aad2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.358 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.362 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "refresh_cache-af1b51b4-9c51-443a-932e-a48750d61085" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.363 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquired lock "refresh_cache-af1b51b4-9c51-443a-932e-a48750d61085" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.363 187223 DEBUG nova.network.neutron [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.485 187223 DEBUG nova.compute.manager [req-e4c641e0-10c9-4ae1-a1e3-a6b80f9ddd7b req-21d08afe-2374-4efc-b997-73073ac45f05 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Received event network-changed-9274e936-3662-499c-89b5-4b605917aad2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.486 187223 DEBUG nova.compute.manager [req-e4c641e0-10c9-4ae1-a1e3-a6b80f9ddd7b req-21d08afe-2374-4efc-b997-73073ac45f05 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Refreshing instance network info cache due to event network-changed-9274e936-3662-499c-89b5-4b605917aad2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.486 187223 DEBUG oslo_concurrency.lockutils [req-e4c641e0-10c9-4ae1-a1e3-a6b80f9ddd7b req-21d08afe-2374-4efc-b997-73073ac45f05 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-af1b51b4-9c51-443a-932e-a48750d61085" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.605 187223 DEBUG nova.network.neutron [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 13:54:47 np0005535656 nova_compute[187219]: 2025-11-25 18:54:47.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.699 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.748 187223 DEBUG nova.network.neutron [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Updating instance_info_cache with network_info: [{"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.776 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Releasing lock "refresh_cache-af1b51b4-9c51-443a-932e-a48750d61085" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.776 187223 DEBUG nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Instance network_info: |[{"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.777 187223 DEBUG oslo_concurrency.lockutils [req-e4c641e0-10c9-4ae1-a1e3-a6b80f9ddd7b req-21d08afe-2374-4efc-b997-73073ac45f05 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-af1b51b4-9c51-443a-932e-a48750d61085" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.777 187223 DEBUG nova.network.neutron [req-e4c641e0-10c9-4ae1-a1e3-a6b80f9ddd7b req-21d08afe-2374-4efc-b997-73073ac45f05 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Refreshing network info cache for port 9274e936-3662-499c-89b5-4b605917aad2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.780 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Start _get_guest_xml network_info=[{"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.785 187223 WARNING nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.789 187223 DEBUG nova.virt.libvirt.host [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.790 187223 DEBUG nova.virt.libvirt.host [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.799 187223 DEBUG nova.virt.libvirt.host [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.800 187223 DEBUG nova.virt.libvirt.host [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.801 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.802 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.803 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.803 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.804 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.804 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.804 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.805 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.805 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.806 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.806 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.807 187223 DEBUG nova.virt.hardware [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.812 187223 DEBUG nova.virt.libvirt.vif [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:54:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-535417570',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-535417570',id=5,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-5b4iwpr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-projec
t-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:54:43Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=af1b51b4-9c51-443a-932e-a48750d61085,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.813 187223 DEBUG nova.network.os_vif_util [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.814 187223 DEBUG nova.network.os_vif_util [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:9a:75,bridge_name='br-int',has_traffic_filtering=True,id=9274e936-3662-499c-89b5-4b605917aad2,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9274e936-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.815 187223 DEBUG nova.objects.instance [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'pci_devices' on Instance uuid af1b51b4-9c51-443a-932e-a48750d61085 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.833 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] End _get_guest_xml xml=<domain type="kvm">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <uuid>af1b51b4-9c51-443a-932e-a48750d61085</uuid>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <name>instance-00000005</name>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-535417570</nova:name>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 18:54:48</nova:creationTime>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:        <nova:user uuid="be3c7719092245a3b39ec72ada0c5247">tempest-TestExecuteActionsViaActuator-53937300-project-member</nova:user>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:        <nova:project uuid="90f5f32749934e1bb4a31b5643dc964a">tempest-TestExecuteActionsViaActuator-53937300</nova:project>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:        <nova:port uuid="9274e936-3662-499c-89b5-4b605917aad2">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <system>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <entry name="serial">af1b51b4-9c51-443a-932e-a48750d61085</entry>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <entry name="uuid">af1b51b4-9c51-443a-932e-a48750d61085</entry>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    </system>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <os>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  </os>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <features>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  </features>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  </clock>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  <devices>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk.config"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:35:9a:75"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <target dev="tap9274e936-36"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    </interface>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/console.log" append="off"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    </serial>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <video>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    </video>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    </rng>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 13:54:48 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 13:54:48 np0005535656 nova_compute[187219]:  </devices>
Nov 25 13:54:48 np0005535656 nova_compute[187219]: </domain>
Nov 25 13:54:48 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.834 187223 DEBUG nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Preparing to wait for external event network-vif-plugged-9274e936-3662-499c-89b5-4b605917aad2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.835 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "af1b51b4-9c51-443a-932e-a48750d61085-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.835 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.835 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.836 187223 DEBUG nova.virt.libvirt.vif [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:54:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-535417570',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-535417570',id=5,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-5b4iwpr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937
300-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:54:43Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=af1b51b4-9c51-443a-932e-a48750d61085,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.836 187223 DEBUG nova.network.os_vif_util [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.837 187223 DEBUG nova.network.os_vif_util [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:9a:75,bridge_name='br-int',has_traffic_filtering=True,id=9274e936-3662-499c-89b5-4b605917aad2,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9274e936-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.837 187223 DEBUG os_vif [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:9a:75,bridge_name='br-int',has_traffic_filtering=True,id=9274e936-3662-499c-89b5-4b605917aad2,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9274e936-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.838 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.838 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.838 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.842 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.842 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9274e936-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.843 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9274e936-36, col_values=(('external_ids', {'iface-id': '9274e936-3662-499c-89b5-4b605917aad2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:9a:75', 'vm-uuid': 'af1b51b4-9c51-443a-932e-a48750d61085'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.844 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:48 np0005535656 NetworkManager[55548]: <info>  [1764096888.8451] manager: (tap9274e936-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.847 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.851 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.852 187223 INFO os_vif [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:9a:75,bridge_name='br-int',has_traffic_filtering=True,id=9274e936-3662-499c-89b5-4b605917aad2,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9274e936-36')
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.911 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.911 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.912 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No VIF found with MAC fa:16:3e:35:9a:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.912 187223 INFO nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Using config drive
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.959 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.960 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.960 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 25 13:54:48 np0005535656 nova_compute[187219]: 2025-11-25 18:54:48.961 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 135f8d09-972f-4564-a9cf-74128ae9320a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 25 13:54:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:54:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:54:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:54:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:54:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:54:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:54:49 np0005535656 nova_compute[187219]: 2025-11-25 18:54:49.464 187223 INFO nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Creating config drive at /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk.config
Nov 25 13:54:49 np0005535656 nova_compute[187219]: 2025-11-25 18:54:49.469 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphqb0_lk7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 13:54:49 np0005535656 nova_compute[187219]: 2025-11-25 18:54:49.592 187223 DEBUG oslo_concurrency.processutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphqb0_lk7" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 13:54:49 np0005535656 kernel: tap9274e936-36: entered promiscuous mode
Nov 25 13:54:49 np0005535656 NetworkManager[55548]: <info>  [1764096889.6486] manager: (tap9274e936-36): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Nov 25 13:54:49 np0005535656 nova_compute[187219]: 2025-11-25 18:54:49.681 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:49 np0005535656 ovn_controller[95460]: 2025-11-25T18:54:49Z|00040|binding|INFO|Claiming lport 9274e936-3662-499c-89b5-4b605917aad2 for this chassis.
Nov 25 13:54:49 np0005535656 ovn_controller[95460]: 2025-11-25T18:54:49Z|00041|binding|INFO|9274e936-3662-499c-89b5-4b605917aad2: Claiming fa:16:3e:35:9a:75 10.100.0.13
Nov 25 13:54:49 np0005535656 ovn_controller[95460]: 2025-11-25T18:54:49Z|00042|binding|INFO|Setting lport 9274e936-3662-499c-89b5-4b605917aad2 ovn-installed in OVS
Nov 25 13:54:49 np0005535656 ovn_controller[95460]: 2025-11-25T18:54:49Z|00043|binding|INFO|Setting lport 9274e936-3662-499c-89b5-4b605917aad2 up in Southbound
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.700 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:9a:75 10.100.0.13'], port_security=['fa:16:3e:35:9a:75 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'af1b51b4-9c51-443a-932e-a48750d61085', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90f5f32749934e1bb4a31b5643dc964a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3235d006-85b4-4c07-966c-48d4df16258d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dde4be2a-475e-47e2-8532-faebae80eb26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=9274e936-3662-499c-89b5-4b605917aad2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.701 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 9274e936-3662-499c-89b5-4b605917aad2 in datapath fe81e455-495f-4aea-8dd6-8b6f8cf5d198 bound to our chassis
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.703 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe81e455-495f-4aea-8dd6-8b6f8cf5d198
Nov 25 13:54:49 np0005535656 nova_compute[187219]: 2025-11-25 18:54:49.704 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:49 np0005535656 nova_compute[187219]: 2025-11-25 18:54:49.706 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:49 np0005535656 systemd-machined[153481]: New machine qemu-3-instance-00000005.
Nov 25 13:54:49 np0005535656 systemd-udevd[209586]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.717 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b0adc5-cbfc-45ce-a344-44ee154babc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 13:54:49 np0005535656 systemd[1]: Started Virtual Machine qemu-3-instance-00000005.
Nov 25 13:54:49 np0005535656 NetworkManager[55548]: <info>  [1764096889.7302] device (tap9274e936-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 13:54:49 np0005535656 NetworkManager[55548]: <info>  [1764096889.7315] device (tap9274e936-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.754 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[d86c09c6-a643-454c-8a1c-3eec9be4abdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.758 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[58baee23-dec1-4472-8ccc-5e2ff754dc30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.794 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[fac77420-10bd-4ca8-b885-f7f90298ce00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.816 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce88a0e-3ae3-4178-9c2e-999f8c8a34e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe81e455-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:a2:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379988, 'reachable_time': 34946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209599, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.841 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a2cc5a36-d066-47b0-bc07-4e86778c52c6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfe81e455-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379997, 'tstamp': 379997}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209600, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfe81e455-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379999, 'tstamp': 379999}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209600, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 13:54:49 np0005535656 nova_compute[187219]: 2025-11-25 18:54:49.843 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.844 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe81e455-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 13:54:49 np0005535656 nova_compute[187219]: 2025-11-25 18:54:49.846 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:49 np0005535656 nova_compute[187219]: 2025-11-25 18:54:49.847 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.848 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe81e455-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.848 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.849 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe81e455-40, col_values=(('external_ids', {'iface-id': '035fc4d6-bdf9-4495-a5a8-2c835f3dfc48'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 13:54:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:49.850 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.177 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096890.1768246, af1b51b4-9c51-443a-932e-a48750d61085 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.178 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] VM Started (Lifecycle Event)
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.201 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.205 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096890.177255, af1b51b4-9c51-443a-932e-a48750d61085 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.206 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] VM Paused (Lifecycle Event)
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.235 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.238 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.275 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.286 187223 DEBUG nova.compute.manager [req-217cedc5-cdd8-44e4-ab57-67472c6192f0 req-61778c76-6338-4678-885d-b512873de0bc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Received event network-vif-plugged-9274e936-3662-499c-89b5-4b605917aad2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.287 187223 DEBUG oslo_concurrency.lockutils [req-217cedc5-cdd8-44e4-ab57-67472c6192f0 req-61778c76-6338-4678-885d-b512873de0bc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "af1b51b4-9c51-443a-932e-a48750d61085-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.287 187223 DEBUG oslo_concurrency.lockutils [req-217cedc5-cdd8-44e4-ab57-67472c6192f0 req-61778c76-6338-4678-885d-b512873de0bc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.287 187223 DEBUG oslo_concurrency.lockutils [req-217cedc5-cdd8-44e4-ab57-67472c6192f0 req-61778c76-6338-4678-885d-b512873de0bc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.287 187223 DEBUG nova.compute.manager [req-217cedc5-cdd8-44e4-ab57-67472c6192f0 req-61778c76-6338-4678-885d-b512873de0bc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Processing event network-vif-plugged-9274e936-3662-499c-89b5-4b605917aad2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.288 187223 DEBUG nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.292 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.292 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096890.291095, af1b51b4-9c51-443a-932e-a48750d61085 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.292 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] VM Resumed (Lifecycle Event)
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.296 187223 INFO nova.virt.libvirt.driver [-] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Instance spawned successfully.
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.296 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.333 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.334 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.334 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.335 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.335 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.336 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.336 187223 DEBUG nova.virt.libvirt.driver [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.340 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.385 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.444 187223 INFO nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Took 6.38 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.444 187223 DEBUG nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.532 187223 INFO nova.compute.manager [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Took 7.09 seconds to build instance.#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.550 187223 DEBUG oslo_concurrency.lockutils [None req-910cd562-23ff-465a-b9ae-d49204c940bc be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.565 187223 DEBUG nova.network.neutron [req-e4c641e0-10c9-4ae1-a1e3-a6b80f9ddd7b req-21d08afe-2374-4efc-b997-73073ac45f05 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Updated VIF entry in instance network info cache for port 9274e936-3662-499c-89b5-4b605917aad2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.566 187223 DEBUG nova.network.neutron [req-e4c641e0-10c9-4ae1-a1e3-a6b80f9ddd7b req-21d08afe-2374-4efc-b997-73073ac45f05 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Updating instance_info_cache with network_info: [{"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:54:50 np0005535656 nova_compute[187219]: 2025-11-25 18:54:50.581 187223 DEBUG oslo_concurrency.lockutils [req-e4c641e0-10c9-4ae1-a1e3-a6b80f9ddd7b req-21d08afe-2374-4efc-b997-73073ac45f05 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-af1b51b4-9c51-443a-932e-a48750d61085" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:54:50 np0005535656 podman[209609]: 2025-11-25 18:54:50.978640927 +0000 UTC m=+0.078815256 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 13:54:51 np0005535656 podman[209608]: 2025-11-25 18:54:51.014034468 +0000 UTC m=+0.117125757 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.046 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Updating instance_info_cache with network_info: [{"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.076 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.076 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.077 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.077 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.077 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.077 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.078 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.078 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.098 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 25 13:54:51 np0005535656 nova_compute[187219]: 2025-11-25 18:54:51.694 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:52 np0005535656 nova_compute[187219]: 2025-11-25 18:54:52.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.306 187223 DEBUG nova.compute.manager [req-21e8fa71-51d2-449c-9919-0b4b8bb26404 req-4a83e212-a143-49e3-9723-593f3f8a86fc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Received event network-vif-plugged-9274e936-3662-499c-89b5-4b605917aad2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.307 187223 DEBUG oslo_concurrency.lockutils [req-21e8fa71-51d2-449c-9919-0b4b8bb26404 req-4a83e212-a143-49e3-9723-593f3f8a86fc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "af1b51b4-9c51-443a-932e-a48750d61085-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.307 187223 DEBUG oslo_concurrency.lockutils [req-21e8fa71-51d2-449c-9919-0b4b8bb26404 req-4a83e212-a143-49e3-9723-593f3f8a86fc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.307 187223 DEBUG oslo_concurrency.lockutils [req-21e8fa71-51d2-449c-9919-0b4b8bb26404 req-4a83e212-a143-49e3-9723-593f3f8a86fc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.307 187223 DEBUG nova.compute.manager [req-21e8fa71-51d2-449c-9919-0b4b8bb26404 req-4a83e212-a143-49e3-9723-593f3f8a86fc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] No waiting events found dispatching network-vif-plugged-9274e936-3662-499c-89b5-4b605917aad2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.307 187223 WARNING nova.compute.manager [req-21e8fa71-51d2-449c-9919-0b4b8bb26404 req-4a83e212-a143-49e3-9723-593f3f8a86fc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Received unexpected event network-vif-plugged-9274e936-3662-499c-89b5-4b605917aad2 for instance with vm_state active and task_state None.#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.363 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.364 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.364 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.364 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.456 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.517 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.518 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.572 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.579 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.639 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.640 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.699 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.845 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.847 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.848 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5567MB free_disk=73.13775634765625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.848 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.849 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.995 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 135f8d09-972f-4564-a9cf-74128ae9320a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.995 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance af1b51b4-9c51-443a-932e-a48750d61085 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.995 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:54:53 np0005535656 nova_compute[187219]: 2025-11-25 18:54:53.995 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:54:54 np0005535656 nova_compute[187219]: 2025-11-25 18:54:54.150 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:54:54 np0005535656 nova_compute[187219]: 2025-11-25 18:54:54.188 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:54:54 np0005535656 nova_compute[187219]: 2025-11-25 18:54:54.212 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:54:54 np0005535656 nova_compute[187219]: 2025-11-25 18:54:54.212 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:54 np0005535656 nova_compute[187219]: 2025-11-25 18:54:54.847 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:57 np0005535656 nova_compute[187219]: 2025-11-25 18:54:57.213 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:57 np0005535656 nova_compute[187219]: 2025-11-25 18:54:57.214 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:54:57 np0005535656 podman[209665]: 2025-11-25 18:54:57.951263807 +0000 UTC m=+0.063624889 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=)
Nov 25 13:54:58 np0005535656 nova_compute[187219]: 2025-11-25 18:54:58.803 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Triggering sync for uuid 135f8d09-972f-4564-a9cf-74128ae9320a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 25 13:54:58 np0005535656 nova_compute[187219]: 2025-11-25 18:54:58.803 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Triggering sync for uuid af1b51b4-9c51-443a-932e-a48750d61085 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 25 13:54:58 np0005535656 nova_compute[187219]: 2025-11-25 18:54:58.804 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:58 np0005535656 nova_compute[187219]: 2025-11-25 18:54:58.804 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "135f8d09-972f-4564-a9cf-74128ae9320a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:58 np0005535656 nova_compute[187219]: 2025-11-25 18:54:58.805 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "af1b51b4-9c51-443a-932e-a48750d61085" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:58 np0005535656 nova_compute[187219]: 2025-11-25 18:54:58.805 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "af1b51b4-9c51-443a-932e-a48750d61085" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:58 np0005535656 nova_compute[187219]: 2025-11-25 18:54:58.865 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "135f8d09-972f-4564-a9cf-74128ae9320a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:58 np0005535656 nova_compute[187219]: 2025-11-25 18:54:58.867 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "af1b51b4-9c51-443a-932e-a48750d61085" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:58 np0005535656 nova_compute[187219]: 2025-11-25 18:54:58.892 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:54:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:59.068 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:54:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:59.071 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:54:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:54:59.071 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:54:59 np0005535656 nova_compute[187219]: 2025-11-25 18:54:59.866 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:00 np0005535656 podman[209688]: 2025-11-25 18:55:00.987281771 +0000 UTC m=+0.086233048 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 13:55:02 np0005535656 nova_compute[187219]: 2025-11-25 18:55:02.720 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "482cc299-5b06-4501-a819-6556a71a4ad2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:02 np0005535656 nova_compute[187219]: 2025-11-25 18:55:02.721 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:02 np0005535656 nova_compute[187219]: 2025-11-25 18:55:02.742 187223 DEBUG nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 13:55:02 np0005535656 nova_compute[187219]: 2025-11-25 18:55:02.850 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:02 np0005535656 nova_compute[187219]: 2025-11-25 18:55:02.851 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:02 np0005535656 nova_compute[187219]: 2025-11-25 18:55:02.858 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 13:55:02 np0005535656 nova_compute[187219]: 2025-11-25 18:55:02.859 187223 INFO nova.compute.claims [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.107 187223 DEBUG nova.compute.provider_tree [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.129 187223 DEBUG nova.scheduler.client.report [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.162 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.163 187223 DEBUG nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.229 187223 DEBUG nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.230 187223 DEBUG nova.network.neutron [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.256 187223 INFO nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.298 187223 DEBUG nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.610 187223 DEBUG nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.612 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.612 187223 INFO nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Creating image(s)#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.613 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "/var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.613 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "/var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.615 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "/var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.636 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.704 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.705 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.706 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.718 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.773 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.775 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.806 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.807 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.808 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.858 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.860 187223 DEBUG nova.virt.disk.api [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Checking if we can resize image /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.860 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.896 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.913 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.914 187223 DEBUG nova.virt.disk.api [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Cannot resize image /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.914 187223 DEBUG nova.objects.instance [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'migration_context' on Instance uuid 482cc299-5b06-4501-a819-6556a71a4ad2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.943 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.943 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Ensure instance console log exists: /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.944 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.944 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:03 np0005535656 nova_compute[187219]: 2025-11-25 18:55:03.944 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:04 np0005535656 nova_compute[187219]: 2025-11-25 18:55:04.064 187223 DEBUG nova.policy [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be3c7719092245a3b39ec72ada0c5247', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90f5f32749934e1bb4a31b5643dc964a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 13:55:04 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:04Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:9a:75 10.100.0.13
Nov 25 13:55:04 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:04Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:9a:75 10.100.0.13
Nov 25 13:55:04 np0005535656 nova_compute[187219]: 2025-11-25 18:55:04.874 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:05 np0005535656 nova_compute[187219]: 2025-11-25 18:55:05.398 187223 DEBUG nova.network.neutron [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Successfully created port: 9583d685-83d5-480e-a534-d81a55c68f50 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 13:55:05 np0005535656 podman[197580]: time="2025-11-25T18:55:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:55:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:55:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:55:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:55:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Nov 25 13:55:07 np0005535656 nova_compute[187219]: 2025-11-25 18:55:07.210 187223 DEBUG nova.network.neutron [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Successfully updated port: 9583d685-83d5-480e-a534-d81a55c68f50 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 13:55:07 np0005535656 nova_compute[187219]: 2025-11-25 18:55:07.240 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "refresh_cache-482cc299-5b06-4501-a819-6556a71a4ad2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:55:07 np0005535656 nova_compute[187219]: 2025-11-25 18:55:07.240 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquired lock "refresh_cache-482cc299-5b06-4501-a819-6556a71a4ad2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:55:07 np0005535656 nova_compute[187219]: 2025-11-25 18:55:07.240 187223 DEBUG nova.network.neutron [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 13:55:07 np0005535656 nova_compute[187219]: 2025-11-25 18:55:07.344 187223 DEBUG nova.compute.manager [req-1975932f-f412-4762-b62e-c71405a2eccc req-71337d94-361b-4fac-bd89-603625191356 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Received event network-changed-9583d685-83d5-480e-a534-d81a55c68f50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:07 np0005535656 nova_compute[187219]: 2025-11-25 18:55:07.345 187223 DEBUG nova.compute.manager [req-1975932f-f412-4762-b62e-c71405a2eccc req-71337d94-361b-4fac-bd89-603625191356 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Refreshing instance network info cache due to event network-changed-9583d685-83d5-480e-a534-d81a55c68f50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 13:55:07 np0005535656 nova_compute[187219]: 2025-11-25 18:55:07.345 187223 DEBUG oslo_concurrency.lockutils [req-1975932f-f412-4762-b62e-c71405a2eccc req-71337d94-361b-4fac-bd89-603625191356 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-482cc299-5b06-4501-a819-6556a71a4ad2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:55:07 np0005535656 nova_compute[187219]: 2025-11-25 18:55:07.497 187223 DEBUG nova.network.neutron [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 13:55:08 np0005535656 nova_compute[187219]: 2025-11-25 18:55:08.903 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.075 187223 DEBUG nova.network.neutron [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Updating instance_info_cache with network_info: [{"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.117 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Releasing lock "refresh_cache-482cc299-5b06-4501-a819-6556a71a4ad2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.118 187223 DEBUG nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Instance network_info: |[{"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.118 187223 DEBUG oslo_concurrency.lockutils [req-1975932f-f412-4762-b62e-c71405a2eccc req-71337d94-361b-4fac-bd89-603625191356 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-482cc299-5b06-4501-a819-6556a71a4ad2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.118 187223 DEBUG nova.network.neutron [req-1975932f-f412-4762-b62e-c71405a2eccc req-71337d94-361b-4fac-bd89-603625191356 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Refreshing network info cache for port 9583d685-83d5-480e-a534-d81a55c68f50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.120 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Start _get_guest_xml network_info=[{"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.124 187223 WARNING nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.129 187223 DEBUG nova.virt.libvirt.host [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.130 187223 DEBUG nova.virt.libvirt.host [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.135 187223 DEBUG nova.virt.libvirt.host [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.137 187223 DEBUG nova.virt.libvirt.host [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.139 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.139 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.140 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.141 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.141 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.142 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.142 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.143 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.143 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.144 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.144 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.145 187223 DEBUG nova.virt.hardware [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.151 187223 DEBUG nova.virt.libvirt.vif [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:54:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-731786201',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-731786201',id=6,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-0w2a48af',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:55:03Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=482cc299-5b06-4501-a819-6556a71a4ad2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.151 187223 DEBUG nova.network.os_vif_util [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.152 187223 DEBUG nova.network.os_vif_util [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:50:89,bridge_name='br-int',has_traffic_filtering=True,id=9583d685-83d5-480e-a534-d81a55c68f50,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9583d685-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.154 187223 DEBUG nova.objects.instance [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'pci_devices' on Instance uuid 482cc299-5b06-4501-a819-6556a71a4ad2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.176 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] End _get_guest_xml xml=<domain type="kvm">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <uuid>482cc299-5b06-4501-a819-6556a71a4ad2</uuid>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <name>instance-00000006</name>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-731786201</nova:name>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 18:55:09</nova:creationTime>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:        <nova:user uuid="be3c7719092245a3b39ec72ada0c5247">tempest-TestExecuteActionsViaActuator-53937300-project-member</nova:user>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:        <nova:project uuid="90f5f32749934e1bb4a31b5643dc964a">tempest-TestExecuteActionsViaActuator-53937300</nova:project>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:        <nova:port uuid="9583d685-83d5-480e-a534-d81a55c68f50">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <system>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <entry name="serial">482cc299-5b06-4501-a819-6556a71a4ad2</entry>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <entry name="uuid">482cc299-5b06-4501-a819-6556a71a4ad2</entry>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    </system>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <os>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  </os>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <features>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  </features>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  </clock>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  <devices>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk.config"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:40:50:89"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <target dev="tap9583d685-83"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    </interface>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/console.log" append="off"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    </serial>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <video>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    </video>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    </rng>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 13:55:09 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 13:55:09 np0005535656 nova_compute[187219]:  </devices>
Nov 25 13:55:09 np0005535656 nova_compute[187219]: </domain>
Nov 25 13:55:09 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.178 187223 DEBUG nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Preparing to wait for external event network-vif-plugged-9583d685-83d5-480e-a534-d81a55c68f50 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.178 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.179 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.179 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.180 187223 DEBUG nova.virt.libvirt.vif [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:54:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-731786201',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-731786201',id=6,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-0w2a48af',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937
300-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:55:03Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=482cc299-5b06-4501-a819-6556a71a4ad2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.180 187223 DEBUG nova.network.os_vif_util [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.181 187223 DEBUG nova.network.os_vif_util [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:50:89,bridge_name='br-int',has_traffic_filtering=True,id=9583d685-83d5-480e-a534-d81a55c68f50,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9583d685-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.181 187223 DEBUG os_vif [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:50:89,bridge_name='br-int',has_traffic_filtering=True,id=9583d685-83d5-480e-a534-d81a55c68f50,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9583d685-83') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.182 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.182 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.183 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.189 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.189 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9583d685-83, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.190 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9583d685-83, col_values=(('external_ids', {'iface-id': '9583d685-83d5-480e-a534-d81a55c68f50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:50:89', 'vm-uuid': '482cc299-5b06-4501-a819-6556a71a4ad2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.226 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:09 np0005535656 NetworkManager[55548]: <info>  [1764096909.2274] manager: (tap9583d685-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.230 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.236 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.238 187223 INFO os_vif [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:50:89,bridge_name='br-int',has_traffic_filtering=True,id=9583d685-83d5-480e-a534-d81a55c68f50,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9583d685-83')#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.304 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.305 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.305 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] No VIF found with MAC fa:16:3e:40:50:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.306 187223 INFO nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Using config drive#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.877 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.912 187223 INFO nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Creating config drive at /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk.config#033[00m
Nov 25 13:55:09 np0005535656 nova_compute[187219]: 2025-11-25 18:55:09.919 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkketzehp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.057 187223 DEBUG oslo_concurrency.processutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkketzehp" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:10 np0005535656 kernel: tap9583d685-83: entered promiscuous mode
Nov 25 13:55:10 np0005535656 NetworkManager[55548]: <info>  [1764096910.1450] manager: (tap9583d685-83): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Nov 25 13:55:10 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:10Z|00044|binding|INFO|Claiming lport 9583d685-83d5-480e-a534-d81a55c68f50 for this chassis.
Nov 25 13:55:10 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:10Z|00045|binding|INFO|9583d685-83d5-480e-a534-d81a55c68f50: Claiming fa:16:3e:40:50:89 10.100.0.5
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.145 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.160 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:50:89 10.100.0.5'], port_security=['fa:16:3e:40:50:89 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '482cc299-5b06-4501-a819-6556a71a4ad2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90f5f32749934e1bb4a31b5643dc964a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3235d006-85b4-4c07-966c-48d4df16258d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dde4be2a-475e-47e2-8532-faebae80eb26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=9583d685-83d5-480e-a534-d81a55c68f50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.162 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 9583d685-83d5-480e-a534-d81a55c68f50 in datapath fe81e455-495f-4aea-8dd6-8b6f8cf5d198 bound to our chassis#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.164 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.165 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe81e455-495f-4aea-8dd6-8b6f8cf5d198#033[00m
Nov 25 13:55:10 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:10Z|00046|binding|INFO|Setting lport 9583d685-83d5-480e-a534-d81a55c68f50 ovn-installed in OVS
Nov 25 13:55:10 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:10Z|00047|binding|INFO|Setting lport 9583d685-83d5-480e-a534-d81a55c68f50 up in Southbound
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.168 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.172 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.184 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[389dd348-33fd-406f-a68e-2e5a8bc064ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:10 np0005535656 systemd-machined[153481]: New machine qemu-4-instance-00000006.
Nov 25 13:55:10 np0005535656 systemd-udevd[209764]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:55:10 np0005535656 systemd[1]: Started Virtual Machine qemu-4-instance-00000006.
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.220 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[fd0a9e68-362c-4b02-adfe-89c22bcad59f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:10 np0005535656 NetworkManager[55548]: <info>  [1764096910.2252] device (tap9583d685-83): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 13:55:10 np0005535656 NetworkManager[55548]: <info>  [1764096910.2265] device (tap9583d685-83): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.227 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[b3867523-2461-481c-81de-b2959df0eb77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.268 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[97a3d769-e825-4ef6-b568-07f9e8074233]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.293 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a26fd11d-c147-4c3e-8b30-9563961d0729]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe81e455-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:a2:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379988, 'reachable_time': 34946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209774, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.316 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d9b8c8-ae76-4ecc-9c37-3bbc7b8e904a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfe81e455-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379997, 'tstamp': 379997}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209776, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfe81e455-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379999, 'tstamp': 379999}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209776, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.319 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe81e455-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.321 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.322 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.324 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe81e455-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.324 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.325 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe81e455-40, col_values=(('external_ids', {'iface-id': '035fc4d6-bdf9-4495-a5a8-2c835f3dfc48'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:10.325 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.890 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096910.8903964, 482cc299-5b06-4501-a819-6556a71a4ad2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.891 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] VM Started (Lifecycle Event)#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.919 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.922 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096910.8934577, 482cc299-5b06-4501-a819-6556a71a4ad2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.922 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] VM Paused (Lifecycle Event)#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.949 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.952 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:55:10 np0005535656 nova_compute[187219]: 2025-11-25 18:55:10.998 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.263 187223 DEBUG nova.compute.manager [req-962ef0b9-9d13-45d1-b7e5-34e92943d12a req-ef52edea-764b-4fed-83d2-1a07ee7c0cfc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Received event network-vif-plugged-9583d685-83d5-480e-a534-d81a55c68f50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.264 187223 DEBUG oslo_concurrency.lockutils [req-962ef0b9-9d13-45d1-b7e5-34e92943d12a req-ef52edea-764b-4fed-83d2-1a07ee7c0cfc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.265 187223 DEBUG oslo_concurrency.lockutils [req-962ef0b9-9d13-45d1-b7e5-34e92943d12a req-ef52edea-764b-4fed-83d2-1a07ee7c0cfc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.265 187223 DEBUG oslo_concurrency.lockutils [req-962ef0b9-9d13-45d1-b7e5-34e92943d12a req-ef52edea-764b-4fed-83d2-1a07ee7c0cfc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.266 187223 DEBUG nova.compute.manager [req-962ef0b9-9d13-45d1-b7e5-34e92943d12a req-ef52edea-764b-4fed-83d2-1a07ee7c0cfc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Processing event network-vif-plugged-9583d685-83d5-480e-a534-d81a55c68f50 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.267 187223 DEBUG nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.272 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096913.2723823, 482cc299-5b06-4501-a819-6556a71a4ad2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.273 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] VM Resumed (Lifecycle Event)#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.276 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.281 187223 INFO nova.virt.libvirt.driver [-] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Instance spawned successfully.#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.281 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.309 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.317 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.318 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.319 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.319 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.320 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.321 187223 DEBUG nova.virt.libvirt.driver [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.328 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.389 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.431 187223 INFO nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Took 9.82 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.433 187223 DEBUG nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.514 187223 INFO nova.compute.manager [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Took 10.70 seconds to build instance.#033[00m
Nov 25 13:55:13 np0005535656 nova_compute[187219]: 2025-11-25 18:55:13.538 187223 DEBUG oslo_concurrency.lockutils [None req-cf23ccad-222d-4460-ac64-57c8642cb20f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:13 np0005535656 podman[209785]: 2025-11-25 18:55:13.97795962 +0000 UTC m=+0.082571788 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 13:55:14 np0005535656 nova_compute[187219]: 2025-11-25 18:55:14.071 187223 DEBUG nova.network.neutron [req-1975932f-f412-4762-b62e-c71405a2eccc req-71337d94-361b-4fac-bd89-603625191356 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Updated VIF entry in instance network info cache for port 9583d685-83d5-480e-a534-d81a55c68f50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 13:55:14 np0005535656 nova_compute[187219]: 2025-11-25 18:55:14.071 187223 DEBUG nova.network.neutron [req-1975932f-f412-4762-b62e-c71405a2eccc req-71337d94-361b-4fac-bd89-603625191356 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Updating instance_info_cache with network_info: [{"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:55:14 np0005535656 nova_compute[187219]: 2025-11-25 18:55:14.113 187223 DEBUG oslo_concurrency.lockutils [req-1975932f-f412-4762-b62e-c71405a2eccc req-71337d94-361b-4fac-bd89-603625191356 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-482cc299-5b06-4501-a819-6556a71a4ad2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:55:14 np0005535656 nova_compute[187219]: 2025-11-25 18:55:14.227 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:14 np0005535656 nova_compute[187219]: 2025-11-25 18:55:14.879 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:15 np0005535656 nova_compute[187219]: 2025-11-25 18:55:15.420 187223 DEBUG nova.compute.manager [req-0a8ccad4-8faa-4302-8548-d11d38e561f8 req-2315386f-cc25-4785-85bd-dbf518178d16 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Received event network-vif-plugged-9583d685-83d5-480e-a534-d81a55c68f50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:15 np0005535656 nova_compute[187219]: 2025-11-25 18:55:15.421 187223 DEBUG oslo_concurrency.lockutils [req-0a8ccad4-8faa-4302-8548-d11d38e561f8 req-2315386f-cc25-4785-85bd-dbf518178d16 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:15 np0005535656 nova_compute[187219]: 2025-11-25 18:55:15.422 187223 DEBUG oslo_concurrency.lockutils [req-0a8ccad4-8faa-4302-8548-d11d38e561f8 req-2315386f-cc25-4785-85bd-dbf518178d16 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:15 np0005535656 nova_compute[187219]: 2025-11-25 18:55:15.422 187223 DEBUG oslo_concurrency.lockutils [req-0a8ccad4-8faa-4302-8548-d11d38e561f8 req-2315386f-cc25-4785-85bd-dbf518178d16 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:15 np0005535656 nova_compute[187219]: 2025-11-25 18:55:15.423 187223 DEBUG nova.compute.manager [req-0a8ccad4-8faa-4302-8548-d11d38e561f8 req-2315386f-cc25-4785-85bd-dbf518178d16 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] No waiting events found dispatching network-vif-plugged-9583d685-83d5-480e-a534-d81a55c68f50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:55:15 np0005535656 nova_compute[187219]: 2025-11-25 18:55:15.423 187223 WARNING nova.compute.manager [req-0a8ccad4-8faa-4302-8548-d11d38e561f8 req-2315386f-cc25-4785-85bd-dbf518178d16 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Received unexpected event network-vif-plugged-9583d685-83d5-480e-a534-d81a55c68f50 for instance with vm_state active and task_state None.#033[00m
Nov 25 13:55:19 np0005535656 nova_compute[187219]: 2025-11-25 18:55:19.233 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:55:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:55:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:55:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:55:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:55:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:55:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:55:19 np0005535656 nova_compute[187219]: 2025-11-25 18:55:19.903 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:21 np0005535656 podman[209811]: 2025-11-25 18:55:21.967386326 +0000 UTC m=+0.072318168 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 25 13:55:22 np0005535656 podman[209810]: 2025-11-25 18:55:22.000073583 +0000 UTC m=+0.102713131 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:55:24 np0005535656 nova_compute[187219]: 2025-11-25 18:55:24.235 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:24 np0005535656 nova_compute[187219]: 2025-11-25 18:55:24.908 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:26 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:26Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:50:89 10.100.0.5
Nov 25 13:55:26 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:26Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:50:89 10.100.0.5
Nov 25 13:55:28 np0005535656 podman[209872]: 2025-11-25 18:55:28.978413312 +0000 UTC m=+0.088393998 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Nov 25 13:55:29 np0005535656 nova_compute[187219]: 2025-11-25 18:55:29.237 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:29 np0005535656 nova_compute[187219]: 2025-11-25 18:55:29.944 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:31 np0005535656 nova_compute[187219]: 2025-11-25 18:55:31.831 187223 DEBUG nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Check if temp file /var/lib/nova/instances/tmptvz9bd7h exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 25 13:55:31 np0005535656 nova_compute[187219]: 2025-11-25 18:55:31.832 187223 DEBUG nova.compute.manager [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptvz9bd7h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='135f8d09-972f-4564-a9cf-74128ae9320a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 25 13:55:31 np0005535656 podman[209893]: 2025-11-25 18:55:31.969680668 +0000 UTC m=+0.074418054 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:55:34 np0005535656 nova_compute[187219]: 2025-11-25 18:55:34.240 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:34 np0005535656 nova_compute[187219]: 2025-11-25 18:55:34.278 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:34 np0005535656 nova_compute[187219]: 2025-11-25 18:55:34.359 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:34 np0005535656 nova_compute[187219]: 2025-11-25 18:55:34.361 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:34 np0005535656 nova_compute[187219]: 2025-11-25 18:55:34.416 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:34 np0005535656 nova_compute[187219]: 2025-11-25 18:55:34.946 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:35 np0005535656 podman[197580]: time="2025-11-25T18:55:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:55:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:55:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:55:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:55:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Nov 25 13:55:37 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 13:55:37 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 13:55:37 np0005535656 systemd-logind[788]: New session 28 of user nova.
Nov 25 13:55:37 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 13:55:37 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 13:55:38 np0005535656 systemd[209923]: Queued start job for default target Main User Target.
Nov 25 13:55:38 np0005535656 systemd[209923]: Created slice User Application Slice.
Nov 25 13:55:38 np0005535656 systemd[209923]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 13:55:38 np0005535656 systemd[209923]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 13:55:38 np0005535656 systemd[209923]: Reached target Paths.
Nov 25 13:55:38 np0005535656 systemd[209923]: Reached target Timers.
Nov 25 13:55:38 np0005535656 systemd[209923]: Starting D-Bus User Message Bus Socket...
Nov 25 13:55:38 np0005535656 systemd[209923]: Starting Create User's Volatile Files and Directories...
Nov 25 13:55:38 np0005535656 systemd[209923]: Finished Create User's Volatile Files and Directories.
Nov 25 13:55:38 np0005535656 systemd[209923]: Listening on D-Bus User Message Bus Socket.
Nov 25 13:55:38 np0005535656 systemd[209923]: Reached target Sockets.
Nov 25 13:55:38 np0005535656 systemd[209923]: Reached target Basic System.
Nov 25 13:55:38 np0005535656 systemd[209923]: Reached target Main User Target.
Nov 25 13:55:38 np0005535656 systemd[209923]: Startup finished in 137ms.
Nov 25 13:55:38 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 13:55:38 np0005535656 systemd[1]: Started Session 28 of User nova.
Nov 25 13:55:38 np0005535656 systemd[1]: session-28.scope: Deactivated successfully.
Nov 25 13:55:38 np0005535656 systemd-logind[788]: Session 28 logged out. Waiting for processes to exit.
Nov 25 13:55:38 np0005535656 systemd-logind[788]: Removed session 28.
Nov 25 13:55:39 np0005535656 nova_compute[187219]: 2025-11-25 18:55:39.243 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:39 np0005535656 nova_compute[187219]: 2025-11-25 18:55:39.837 187223 DEBUG nova.compute.manager [req-1a85a15b-d612-423d-852a-cfad1fce5699 req-ae71ac18-a5ae-4a91-bc62-1676b060da59 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-unplugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:39 np0005535656 nova_compute[187219]: 2025-11-25 18:55:39.837 187223 DEBUG oslo_concurrency.lockutils [req-1a85a15b-d612-423d-852a-cfad1fce5699 req-ae71ac18-a5ae-4a91-bc62-1676b060da59 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:39 np0005535656 nova_compute[187219]: 2025-11-25 18:55:39.837 187223 DEBUG oslo_concurrency.lockutils [req-1a85a15b-d612-423d-852a-cfad1fce5699 req-ae71ac18-a5ae-4a91-bc62-1676b060da59 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:39 np0005535656 nova_compute[187219]: 2025-11-25 18:55:39.838 187223 DEBUG oslo_concurrency.lockutils [req-1a85a15b-d612-423d-852a-cfad1fce5699 req-ae71ac18-a5ae-4a91-bc62-1676b060da59 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:39 np0005535656 nova_compute[187219]: 2025-11-25 18:55:39.838 187223 DEBUG nova.compute.manager [req-1a85a15b-d612-423d-852a-cfad1fce5699 req-ae71ac18-a5ae-4a91-bc62-1676b060da59 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] No waiting events found dispatching network-vif-unplugged-906ded83-fa3f-44e8-a187-2d7233b49cba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:55:39 np0005535656 nova_compute[187219]: 2025-11-25 18:55:39.838 187223 DEBUG nova.compute.manager [req-1a85a15b-d612-423d-852a-cfad1fce5699 req-ae71ac18-a5ae-4a91-bc62-1676b060da59 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-unplugged-906ded83-fa3f-44e8-a187-2d7233b49cba for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 13:55:39 np0005535656 nova_compute[187219]: 2025-11-25 18:55:39.949 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:40 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:40Z|00048|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.532 187223 INFO nova.compute.manager [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Took 7.11 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.533 187223 DEBUG nova.compute.manager [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.560 187223 DEBUG nova.compute.manager [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptvz9bd7h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='135f8d09-972f-4564-a9cf-74128ae9320a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(c0374118-0552-4ee5-81cc-422249921a5f),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.586 187223 DEBUG nova.objects.instance [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid 135f8d09-972f-4564-a9cf-74128ae9320a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.589 187223 DEBUG nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.592 187223 DEBUG nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.592 187223 DEBUG nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.623 187223 DEBUG nova.virt.libvirt.vif [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T18:53:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2108820933',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2108820933',id=3,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T18:54:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-t4hk0dbk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T18:54:00Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=135f8d09-972f-4564-a9cf-74128ae9320a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.623 187223 DEBUG nova.network.os_vif_util [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.624 187223 DEBUG nova.network.os_vif_util [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:6f:c4,bridge_name='br-int',has_traffic_filtering=True,id=906ded83-fa3f-44e8-a187-2d7233b49cba,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap906ded83-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.625 187223 DEBUG nova.virt.libvirt.migration [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 13:55:41 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:10:6f:c4"/>
Nov 25 13:55:41 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 13:55:41 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 13:55:41 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 13:55:41 np0005535656 nova_compute[187219]:  <target dev="tap906ded83-fa"/>
Nov 25 13:55:41 np0005535656 nova_compute[187219]: </interface>
Nov 25 13:55:41 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.625 187223 DEBUG nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.996 187223 DEBUG nova.compute.manager [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.996 187223 DEBUG oslo_concurrency.lockutils [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.996 187223 DEBUG oslo_concurrency.lockutils [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.996 187223 DEBUG oslo_concurrency.lockutils [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.997 187223 DEBUG nova.compute.manager [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] No waiting events found dispatching network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.997 187223 WARNING nova.compute.manager [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received unexpected event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba for instance with vm_state active and task_state migrating.#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.997 187223 DEBUG nova.compute.manager [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-changed-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.997 187223 DEBUG nova.compute.manager [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Refreshing instance network info cache due to event network-changed-906ded83-fa3f-44e8-a187-2d7233b49cba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.997 187223 DEBUG oslo_concurrency.lockutils [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.998 187223 DEBUG oslo_concurrency.lockutils [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:55:41 np0005535656 nova_compute[187219]: 2025-11-25 18:55:41.998 187223 DEBUG nova.network.neutron [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Refreshing network info cache for port 906ded83-fa3f-44e8-a187-2d7233b49cba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 13:55:42 np0005535656 nova_compute[187219]: 2025-11-25 18:55:42.096 187223 DEBUG nova.virt.libvirt.migration [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 13:55:42 np0005535656 nova_compute[187219]: 2025-11-25 18:55:42.096 187223 INFO nova.virt.libvirt.migration [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 13:55:42 np0005535656 nova_compute[187219]: 2025-11-25 18:55:42.675 187223 INFO nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 13:55:43 np0005535656 nova_compute[187219]: 2025-11-25 18:55:43.179 187223 DEBUG nova.virt.libvirt.migration [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 13:55:43 np0005535656 nova_compute[187219]: 2025-11-25 18:55:43.179 187223 DEBUG nova.virt.libvirt.migration [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 13:55:43 np0005535656 nova_compute[187219]: 2025-11-25 18:55:43.726 187223 DEBUG nova.virt.libvirt.migration [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 13:55:43 np0005535656 nova_compute[187219]: 2025-11-25 18:55:43.726 187223 DEBUG nova.virt.libvirt.migration [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.229 187223 DEBUG nova.virt.libvirt.migration [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.230 187223 DEBUG nova.virt.libvirt.migration [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.247 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.471 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764096944.4705791, 135f8d09-972f-4564-a9cf-74128ae9320a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.471 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] VM Paused (Lifecycle Event)#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.527 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.532 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:55:44 np0005535656 kernel: tap906ded83-fa (unregistering): left promiscuous mode
Nov 25 13:55:44 np0005535656 NetworkManager[55548]: <info>  [1764096944.6668] device (tap906ded83-fa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 13:55:44 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:44Z|00049|binding|INFO|Releasing lport 906ded83-fa3f-44e8-a187-2d7233b49cba from this chassis (sb_readonly=0)
Nov 25 13:55:44 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:44Z|00050|binding|INFO|Setting lport 906ded83-fa3f-44e8-a187-2d7233b49cba down in Southbound
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.678 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:44 np0005535656 ovn_controller[95460]: 2025-11-25T18:55:44Z|00051|binding|INFO|Removing iface tap906ded83-fa ovn-installed in OVS
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.680 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.681 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.691 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:44 np0005535656 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 25 13:55:44 np0005535656 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Consumed 20.297s CPU time.
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.718 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:6f:c4 10.100.0.10'], port_security=['fa:16:3e:10:6f:c4 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '135f8d09-972f-4564-a9cf-74128ae9320a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90f5f32749934e1bb4a31b5643dc964a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3235d006-85b4-4c07-966c-48d4df16258d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dde4be2a-475e-47e2-8532-faebae80eb26, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=906ded83-fa3f-44e8-a187-2d7233b49cba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:55:44 np0005535656 systemd-machined[153481]: Machine qemu-2-instance-00000003 terminated.
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.719 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 906ded83-fa3f-44e8-a187-2d7233b49cba in datapath fe81e455-495f-4aea-8dd6-8b6f8cf5d198 unbound from our chassis#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.720 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe81e455-495f-4aea-8dd6-8b6f8cf5d198#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.747 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[36c2af5f-3a74-4c45-99c7-5fe2a6f95f59]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:44 np0005535656 podman[209961]: 2025-11-25 18:55:44.773358971 +0000 UTC m=+0.068799760 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.777 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[17fc474d-fb48-425e-bfe4-990098810d2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.781 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[d094e74a-32cd-41a4-8999-c3ca9541b82c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.811 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f9ad3f-30d5-4a29-8a7f-3896e5ce1f3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.827 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[686ba2bc-e825-4a27-96ea-bf70f1722b23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe81e455-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:a2:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379988, 'reachable_time': 34946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209997, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.840 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea8688b-1f87-452e-9de7-67474ac9c60b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfe81e455-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379997, 'tstamp': 379997}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209999, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfe81e455-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379999, 'tstamp': 379999}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209999, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.842 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe81e455-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.843 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.848 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.848 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe81e455-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.849 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.849 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe81e455-40, col_values=(('external_ids', {'iface-id': '035fc4d6-bdf9-4495-a5a8-2c835f3dfc48'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:44.850 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.866 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.871 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.892 187223 DEBUG nova.virt.libvirt.guest [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.893 187223 INFO nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Migration operation has completed#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.893 187223 INFO nova.compute.manager [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] _post_live_migration() is started..#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.898 187223 DEBUG nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.898 187223 DEBUG nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.899 187223 DEBUG nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 13:55:44 np0005535656 nova_compute[187219]: 2025-11-25 18:55:44.951 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:45 np0005535656 nova_compute[187219]: 2025-11-25 18:55:45.728 187223 DEBUG nova.compute.manager [req-cd720818-106b-4b0a-91fc-af694d5fe650 req-c6b2db0c-83c7-432d-9408-8225c7584a04 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-unplugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:45 np0005535656 nova_compute[187219]: 2025-11-25 18:55:45.728 187223 DEBUG oslo_concurrency.lockutils [req-cd720818-106b-4b0a-91fc-af694d5fe650 req-c6b2db0c-83c7-432d-9408-8225c7584a04 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:45 np0005535656 nova_compute[187219]: 2025-11-25 18:55:45.729 187223 DEBUG oslo_concurrency.lockutils [req-cd720818-106b-4b0a-91fc-af694d5fe650 req-c6b2db0c-83c7-432d-9408-8225c7584a04 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:45 np0005535656 nova_compute[187219]: 2025-11-25 18:55:45.729 187223 DEBUG oslo_concurrency.lockutils [req-cd720818-106b-4b0a-91fc-af694d5fe650 req-c6b2db0c-83c7-432d-9408-8225c7584a04 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:45 np0005535656 nova_compute[187219]: 2025-11-25 18:55:45.730 187223 DEBUG nova.compute.manager [req-cd720818-106b-4b0a-91fc-af694d5fe650 req-c6b2db0c-83c7-432d-9408-8225c7584a04 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] No waiting events found dispatching network-vif-unplugged-906ded83-fa3f-44e8-a187-2d7233b49cba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:55:45 np0005535656 nova_compute[187219]: 2025-11-25 18:55:45.730 187223 DEBUG nova.compute.manager [req-cd720818-106b-4b0a-91fc-af694d5fe650 req-c6b2db0c-83c7-432d-9408-8225c7584a04 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-unplugged-906ded83-fa3f-44e8-a187-2d7233b49cba for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 13:55:45 np0005535656 nova_compute[187219]: 2025-11-25 18:55:45.793 187223 DEBUG nova.network.neutron [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Updated VIF entry in instance network info cache for port 906ded83-fa3f-44e8-a187-2d7233b49cba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 13:55:45 np0005535656 nova_compute[187219]: 2025-11-25 18:55:45.794 187223 DEBUG nova.network.neutron [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Updating instance_info_cache with network_info: [{"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:55:45 np0005535656 nova_compute[187219]: 2025-11-25 18:55:45.834 187223 DEBUG oslo_concurrency.lockutils [req-77dbb361-a51d-4b21-b0ad-4aeeb6e677b2 req-1966513f-eab2-4754-961d-b614015fbc78 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-135f8d09-972f-4564-a9cf-74128ae9320a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:55:46 np0005535656 nova_compute[187219]: 2025-11-25 18:55:46.637 187223 DEBUG nova.compute.manager [req-d55da2dc-f9ff-444c-a8f4-1868928fa8a8 req-fe7c5968-7620-4536-bf94-c39a41aea292 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-unplugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:46 np0005535656 nova_compute[187219]: 2025-11-25 18:55:46.637 187223 DEBUG oslo_concurrency.lockutils [req-d55da2dc-f9ff-444c-a8f4-1868928fa8a8 req-fe7c5968-7620-4536-bf94-c39a41aea292 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:46 np0005535656 nova_compute[187219]: 2025-11-25 18:55:46.638 187223 DEBUG oslo_concurrency.lockutils [req-d55da2dc-f9ff-444c-a8f4-1868928fa8a8 req-fe7c5968-7620-4536-bf94-c39a41aea292 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:46 np0005535656 nova_compute[187219]: 2025-11-25 18:55:46.638 187223 DEBUG oslo_concurrency.lockutils [req-d55da2dc-f9ff-444c-a8f4-1868928fa8a8 req-fe7c5968-7620-4536-bf94-c39a41aea292 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:46 np0005535656 nova_compute[187219]: 2025-11-25 18:55:46.638 187223 DEBUG nova.compute.manager [req-d55da2dc-f9ff-444c-a8f4-1868928fa8a8 req-fe7c5968-7620-4536-bf94-c39a41aea292 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] No waiting events found dispatching network-vif-unplugged-906ded83-fa3f-44e8-a187-2d7233b49cba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:55:46 np0005535656 nova_compute[187219]: 2025-11-25 18:55:46.638 187223 DEBUG nova.compute.manager [req-d55da2dc-f9ff-444c-a8f4-1868928fa8a8 req-fe7c5968-7620-4536-bf94-c39a41aea292 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-unplugged-906ded83-fa3f-44e8-a187-2d7233b49cba for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 13:55:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:46.720 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:55:46 np0005535656 nova_compute[187219]: 2025-11-25 18:55:46.721 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:46.722 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.335 187223 DEBUG nova.network.neutron [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Activated binding for port 906ded83-fa3f-44e8-a187-2d7233b49cba and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.335 187223 DEBUG nova.compute.manager [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.336 187223 DEBUG nova.virt.libvirt.vif [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T18:53:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2108820933',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2108820933',id=3,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T18:54:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-t4hk0dbk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T18:55:26Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=135f8d09-972f-4564-a9cf-74128ae9320a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.336 187223 DEBUG nova.network.os_vif_util [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "906ded83-fa3f-44e8-a187-2d7233b49cba", "address": "fa:16:3e:10:6f:c4", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap906ded83-fa", "ovs_interfaceid": "906ded83-fa3f-44e8-a187-2d7233b49cba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.337 187223 DEBUG nova.network.os_vif_util [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:6f:c4,bridge_name='br-int',has_traffic_filtering=True,id=906ded83-fa3f-44e8-a187-2d7233b49cba,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap906ded83-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.337 187223 DEBUG os_vif [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:6f:c4,bridge_name='br-int',has_traffic_filtering=True,id=906ded83-fa3f-44e8-a187-2d7233b49cba,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap906ded83-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.339 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.339 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap906ded83-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.342 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.345 187223 INFO os_vif [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:6f:c4,bridge_name='br-int',has_traffic_filtering=True,id=906ded83-fa3f-44e8-a187-2d7233b49cba,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap906ded83-fa')#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.345 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.346 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.346 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.346 187223 DEBUG nova.compute.manager [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.347 187223 INFO nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Deleting instance files /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a_del#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.347 187223 INFO nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Deletion of /var/lib/nova/instances/135f8d09-972f-4564-a9cf-74128ae9320a_del complete#033[00m
Nov 25 13:55:47 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:47.724 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.889 187223 DEBUG nova.compute.manager [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.890 187223 DEBUG oslo_concurrency.lockutils [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.890 187223 DEBUG oslo_concurrency.lockutils [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.890 187223 DEBUG oslo_concurrency.lockutils [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.890 187223 DEBUG nova.compute.manager [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] No waiting events found dispatching network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.890 187223 WARNING nova.compute.manager [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received unexpected event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba for instance with vm_state active and task_state migrating.#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.891 187223 DEBUG nova.compute.manager [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.891 187223 DEBUG oslo_concurrency.lockutils [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.891 187223 DEBUG oslo_concurrency.lockutils [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.891 187223 DEBUG oslo_concurrency.lockutils [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.891 187223 DEBUG nova.compute.manager [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] No waiting events found dispatching network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.891 187223 WARNING nova.compute.manager [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received unexpected event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba for instance with vm_state active and task_state migrating.#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.892 187223 DEBUG nova.compute.manager [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.892 187223 DEBUG oslo_concurrency.lockutils [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.892 187223 DEBUG oslo_concurrency.lockutils [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.892 187223 DEBUG oslo_concurrency.lockutils [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.892 187223 DEBUG nova.compute.manager [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] No waiting events found dispatching network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:55:47 np0005535656 nova_compute[187219]: 2025-11-25 18:55:47.892 187223 WARNING nova.compute.manager [req-a0c2e5b7-97d6-44f2-aa7f-ace1d9c1fcd5 req-63985b9f-f1c0-4c46-8cda-02a9bd2dff6e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received unexpected event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba for instance with vm_state active and task_state migrating.#033[00m
Nov 25 13:55:48 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 13:55:48 np0005535656 systemd[209923]: Activating special unit Exit the Session...
Nov 25 13:55:48 np0005535656 systemd[209923]: Stopped target Main User Target.
Nov 25 13:55:48 np0005535656 systemd[209923]: Stopped target Basic System.
Nov 25 13:55:48 np0005535656 systemd[209923]: Stopped target Paths.
Nov 25 13:55:48 np0005535656 systemd[209923]: Stopped target Sockets.
Nov 25 13:55:48 np0005535656 systemd[209923]: Stopped target Timers.
Nov 25 13:55:48 np0005535656 systemd[209923]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 13:55:48 np0005535656 systemd[209923]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 13:55:48 np0005535656 systemd[209923]: Closed D-Bus User Message Bus Socket.
Nov 25 13:55:48 np0005535656 systemd[209923]: Stopped Create User's Volatile Files and Directories.
Nov 25 13:55:48 np0005535656 systemd[209923]: Removed slice User Application Slice.
Nov 25 13:55:48 np0005535656 systemd[209923]: Reached target Shutdown.
Nov 25 13:55:48 np0005535656 systemd[209923]: Finished Exit the Session.
Nov 25 13:55:48 np0005535656 systemd[209923]: Reached target Exit the Session.
Nov 25 13:55:48 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 13:55:48 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 13:55:48 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 13:55:48 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 13:55:48 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 13:55:48 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 13:55:48 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 13:55:49 np0005535656 nova_compute[187219]: 2025-11-25 18:55:49.264 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:55:49 np0005535656 nova_compute[187219]: 2025-11-25 18:55:49.265 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:55:49 np0005535656 nova_compute[187219]: 2025-11-25 18:55:49.265 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:55:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:55:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:55:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:55:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:55:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:55:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:55:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:55:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:55:49 np0005535656 nova_compute[187219]: 2025-11-25 18:55:49.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:55:49 np0005535656 nova_compute[187219]: 2025-11-25 18:55:49.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:55:49 np0005535656 nova_compute[187219]: 2025-11-25 18:55:49.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:55:49 np0005535656 nova_compute[187219]: 2025-11-25 18:55:49.952 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:50 np0005535656 nova_compute[187219]: 2025-11-25 18:55:50.875 187223 DEBUG nova.compute.manager [req-8221d462-f45c-4740-bb13-f4224fdac30a req-902d0381-d4f6-4359-9c8f-276b3881e697 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:55:50 np0005535656 nova_compute[187219]: 2025-11-25 18:55:50.875 187223 DEBUG oslo_concurrency.lockutils [req-8221d462-f45c-4740-bb13-f4224fdac30a req-902d0381-d4f6-4359-9c8f-276b3881e697 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:50 np0005535656 nova_compute[187219]: 2025-11-25 18:55:50.876 187223 DEBUG oslo_concurrency.lockutils [req-8221d462-f45c-4740-bb13-f4224fdac30a req-902d0381-d4f6-4359-9c8f-276b3881e697 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:50 np0005535656 nova_compute[187219]: 2025-11-25 18:55:50.876 187223 DEBUG oslo_concurrency.lockutils [req-8221d462-f45c-4740-bb13-f4224fdac30a req-902d0381-d4f6-4359-9c8f-276b3881e697 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:50 np0005535656 nova_compute[187219]: 2025-11-25 18:55:50.877 187223 DEBUG nova.compute.manager [req-8221d462-f45c-4740-bb13-f4224fdac30a req-902d0381-d4f6-4359-9c8f-276b3881e697 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] No waiting events found dispatching network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:55:50 np0005535656 nova_compute[187219]: 2025-11-25 18:55:50.877 187223 WARNING nova.compute.manager [req-8221d462-f45c-4740-bb13-f4224fdac30a req-902d0381-d4f6-4359-9c8f-276b3881e697 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Received unexpected event network-vif-plugged-906ded83-fa3f-44e8-a187-2d7233b49cba for instance with vm_state active and task_state migrating.#033[00m
Nov 25 13:55:51 np0005535656 nova_compute[187219]: 2025-11-25 18:55:51.393 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-af1b51b4-9c51-443a-932e-a48750d61085" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:55:51 np0005535656 nova_compute[187219]: 2025-11-25 18:55:51.393 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-af1b51b4-9c51-443a-932e-a48750d61085" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:55:51 np0005535656 nova_compute[187219]: 2025-11-25 18:55:51.394 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 13:55:51 np0005535656 nova_compute[187219]: 2025-11-25 18:55:51.394 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid af1b51b4-9c51-443a-932e-a48750d61085 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:55:52 np0005535656 nova_compute[187219]: 2025-11-25 18:55:52.343 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:53 np0005535656 podman[210020]: 2025-11-25 18:55:53.000891593 +0000 UTC m=+0.106795284 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 13:55:53 np0005535656 podman[210019]: 2025-11-25 18:55:53.042472475 +0000 UTC m=+0.148754097 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 25 13:55:54 np0005535656 nova_compute[187219]: 2025-11-25 18:55:54.955 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.434 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Updating instance_info_cache with network_info: [{"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.483 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-af1b51b4-9c51-443a-932e-a48750d61085" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.483 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.484 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.484 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.484 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.484 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.485 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.508 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.508 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.508 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.508 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.619 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.686 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.687 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.738 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.743 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.804 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.805 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:56 np0005535656 nova_compute[187219]: 2025-11-25 18:55:56.874 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.035 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.036 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5573MB free_disk=73.11032104492188GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.036 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.036 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.153 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Migration for instance 135f8d09-972f-4564-a9cf-74128ae9320a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.210 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.269 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 482cc299-5b06-4501-a819-6556a71a4ad2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.269 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance af1b51b4-9c51-443a-932e-a48750d61085 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.270 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Migration c0374118-0552-4ee5-81cc-422249921a5f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.270 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.270 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.346 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.497 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.520 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.590 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.590 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.898 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.898 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.898 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "135f8d09-972f-4564-a9cf-74128ae9320a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.945 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.945 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.946 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:57 np0005535656 nova_compute[187219]: 2025-11-25 18:55:57.946 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.107 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.164 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.165 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.227 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.234 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.288 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.289 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.359 187223 DEBUG oslo_concurrency.processutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.550 187223 WARNING nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.553 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5556MB free_disk=73.11030197143555GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.554 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.554 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.653 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration for instance 135f8d09-972f-4564-a9cf-74128ae9320a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.685 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.734 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Instance 482cc299-5b06-4501-a819-6556a71a4ad2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.735 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Instance af1b51b4-9c51-443a-932e-a48750d61085 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.735 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration c0374118-0552-4ee5-81cc-422249921a5f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.735 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.735 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.778 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.903 187223 DEBUG nova.compute.provider_tree [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.962 187223 DEBUG nova.scheduler.client.report [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.965 187223 DEBUG nova.compute.resource_tracker [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.966 187223 DEBUG oslo_concurrency.lockutils [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:58 np0005535656 nova_compute[187219]: 2025-11-25 18:55:58.975 187223 INFO nova.compute.manager [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 25 13:55:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:59.069 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:55:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:59.070 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:55:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:55:59.071 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:55:59 np0005535656 nova_compute[187219]: 2025-11-25 18:55:59.893 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764096944.8920982, 135f8d09-972f-4564-a9cf-74128ae9320a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:55:59 np0005535656 nova_compute[187219]: 2025-11-25 18:55:59.894 187223 INFO nova.compute.manager [-] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] VM Stopped (Lifecycle Event)#033[00m
Nov 25 13:55:59 np0005535656 podman[210088]: 2025-11-25 18:55:59.93986845 +0000 UTC m=+0.062396405 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Nov 25 13:55:59 np0005535656 nova_compute[187219]: 2025-11-25 18:55:59.957 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:01 np0005535656 nova_compute[187219]: 2025-11-25 18:56:01.877 187223 DEBUG nova.compute.manager [None req-99e4aee0-0b9b-4711-8b3e-6a59007e4ee3 - - - - - -] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:56:02 np0005535656 nova_compute[187219]: 2025-11-25 18:56:02.019 187223 INFO nova.scheduler.client.report [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Deleted allocation for migration c0374118-0552-4ee5-81cc-422249921a5f#033[00m
Nov 25 13:56:02 np0005535656 nova_compute[187219]: 2025-11-25 18:56:02.019 187223 DEBUG nova.virt.libvirt.driver [None req-b3528e87-b78b-40b9-8818-62d33542528a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 135f8d09-972f-4564-a9cf-74128ae9320a] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 25 13:56:02 np0005535656 nova_compute[187219]: 2025-11-25 18:56:02.351 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:02 np0005535656 podman[210110]: 2025-11-25 18:56:02.927163068 +0000 UTC m=+0.051941098 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.143 187223 DEBUG oslo_concurrency.lockutils [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "482cc299-5b06-4501-a819-6556a71a4ad2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.144 187223 DEBUG oslo_concurrency.lockutils [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.144 187223 DEBUG oslo_concurrency.lockutils [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.144 187223 DEBUG oslo_concurrency.lockutils [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.144 187223 DEBUG oslo_concurrency.lockutils [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.145 187223 INFO nova.compute.manager [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Terminating instance#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.146 187223 DEBUG nova.compute.manager [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 25 13:56:03 np0005535656 kernel: tap9583d685-83 (unregistering): left promiscuous mode
Nov 25 13:56:03 np0005535656 NetworkManager[55548]: <info>  [1764096963.1746] device (tap9583d685-83): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 13:56:03 np0005535656 ovn_controller[95460]: 2025-11-25T18:56:03Z|00052|binding|INFO|Releasing lport 9583d685-83d5-480e-a534-d81a55c68f50 from this chassis (sb_readonly=0)
Nov 25 13:56:03 np0005535656 ovn_controller[95460]: 2025-11-25T18:56:03Z|00053|binding|INFO|Setting lport 9583d685-83d5-480e-a534-d81a55c68f50 down in Southbound
Nov 25 13:56:03 np0005535656 ovn_controller[95460]: 2025-11-25T18:56:03Z|00054|binding|INFO|Removing iface tap9583d685-83 ovn-installed in OVS
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.224 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.226 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.235 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:50:89 10.100.0.5'], port_security=['fa:16:3e:40:50:89 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '482cc299-5b06-4501-a819-6556a71a4ad2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90f5f32749934e1bb4a31b5643dc964a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3235d006-85b4-4c07-966c-48d4df16258d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dde4be2a-475e-47e2-8532-faebae80eb26, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=9583d685-83d5-480e-a534-d81a55c68f50) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.235 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.236 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 9583d685-83d5-480e-a534-d81a55c68f50 in datapath fe81e455-495f-4aea-8dd6-8b6f8cf5d198 unbound from our chassis#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.237 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe81e455-495f-4aea-8dd6-8b6f8cf5d198#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.257 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7aac3b-d8f5-4bc7-9df6-67c3d39a14ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:03 np0005535656 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 25 13:56:03 np0005535656 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Consumed 14.822s CPU time.
Nov 25 13:56:03 np0005535656 systemd-machined[153481]: Machine qemu-4-instance-00000006 terminated.
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.294 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[628996b4-cf69-4abb-833c-60e5811e18cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.297 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[371b9e78-74ce-4a02-858e-1d05b447c835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.334 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[a92ec166-3174-41e1-88a5-400e3021432a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.349 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4a2e58cb-6a28-4a4b-9d05-59866e502c1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe81e455-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:a2:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379988, 'reachable_time': 34946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210142, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.365 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[632f1136-1bc1-4039-abda-550aed0db43f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfe81e455-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379997, 'tstamp': 379997}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210144, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfe81e455-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379999, 'tstamp': 379999}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210144, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.366 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe81e455-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.368 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.371 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.372 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe81e455-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.372 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.373 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe81e455-40, col_values=(('external_ids', {'iface-id': '035fc4d6-bdf9-4495-a5a8-2c835f3dfc48'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:56:03 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:03.373 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.399 187223 INFO nova.virt.libvirt.driver [-] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Instance destroyed successfully.#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.400 187223 DEBUG nova.objects.instance [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'resources' on Instance uuid 482cc299-5b06-4501-a819-6556a71a4ad2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.416 187223 DEBUG nova.virt.libvirt.vif [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T18:54:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-731786201',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-731786201',id=6,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T18:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-0w2a48af',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T18:55:13Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=482cc299-5b06-4501-a819-6556a71a4ad2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.416 187223 DEBUG nova.network.os_vif_util [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "9583d685-83d5-480e-a534-d81a55c68f50", "address": "fa:16:3e:40:50:89", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9583d685-83", "ovs_interfaceid": "9583d685-83d5-480e-a534-d81a55c68f50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.417 187223 DEBUG nova.network.os_vif_util [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:50:89,bridge_name='br-int',has_traffic_filtering=True,id=9583d685-83d5-480e-a534-d81a55c68f50,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9583d685-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.417 187223 DEBUG os_vif [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:50:89,bridge_name='br-int',has_traffic_filtering=True,id=9583d685-83d5-480e-a534-d81a55c68f50,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9583d685-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.418 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.418 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9583d685-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.419 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.421 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.422 187223 INFO os_vif [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:50:89,bridge_name='br-int',has_traffic_filtering=True,id=9583d685-83d5-480e-a534-d81a55c68f50,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9583d685-83')#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.423 187223 INFO nova.virt.libvirt.driver [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Deleting instance files /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2_del#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.424 187223 INFO nova.virt.libvirt.driver [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Deletion of /var/lib/nova/instances/482cc299-5b06-4501-a819-6556a71a4ad2_del complete#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.533 187223 INFO nova.compute.manager [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.534 187223 DEBUG oslo.service.loopingcall [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.534 187223 DEBUG nova.compute.manager [-] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 25 13:56:03 np0005535656 nova_compute[187219]: 2025-11-25 18:56:03.535 187223 DEBUG nova.network.neutron [-] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 25 13:56:04 np0005535656 nova_compute[187219]: 2025-11-25 18:56:04.960 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:05 np0005535656 podman[197580]: time="2025-11-25T18:56:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:56:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:56:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:56:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:56:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Nov 25 13:56:06 np0005535656 nova_compute[187219]: 2025-11-25 18:56:06.880 187223 DEBUG nova.compute.manager [req-8f19f9e7-8564-49f0-bdff-4d6deb4953e8 req-97617cde-4aab-4b7f-9fba-9a904dfaf84d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Received event network-vif-unplugged-9583d685-83d5-480e-a534-d81a55c68f50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:56:06 np0005535656 nova_compute[187219]: 2025-11-25 18:56:06.880 187223 DEBUG oslo_concurrency.lockutils [req-8f19f9e7-8564-49f0-bdff-4d6deb4953e8 req-97617cde-4aab-4b7f-9fba-9a904dfaf84d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:06 np0005535656 nova_compute[187219]: 2025-11-25 18:56:06.881 187223 DEBUG oslo_concurrency.lockutils [req-8f19f9e7-8564-49f0-bdff-4d6deb4953e8 req-97617cde-4aab-4b7f-9fba-9a904dfaf84d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:06 np0005535656 nova_compute[187219]: 2025-11-25 18:56:06.881 187223 DEBUG oslo_concurrency.lockutils [req-8f19f9e7-8564-49f0-bdff-4d6deb4953e8 req-97617cde-4aab-4b7f-9fba-9a904dfaf84d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:06 np0005535656 nova_compute[187219]: 2025-11-25 18:56:06.881 187223 DEBUG nova.compute.manager [req-8f19f9e7-8564-49f0-bdff-4d6deb4953e8 req-97617cde-4aab-4b7f-9fba-9a904dfaf84d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] No waiting events found dispatching network-vif-unplugged-9583d685-83d5-480e-a534-d81a55c68f50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:56:06 np0005535656 nova_compute[187219]: 2025-11-25 18:56:06.881 187223 DEBUG nova.compute.manager [req-8f19f9e7-8564-49f0-bdff-4d6deb4953e8 req-97617cde-4aab-4b7f-9fba-9a904dfaf84d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Received event network-vif-unplugged-9583d685-83d5-480e-a534-d81a55c68f50 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 13:56:07 np0005535656 nova_compute[187219]: 2025-11-25 18:56:07.070 187223 DEBUG nova.network.neutron [-] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:56:07 np0005535656 nova_compute[187219]: 2025-11-25 18:56:07.314 187223 INFO nova.compute.manager [-] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Took 3.78 seconds to deallocate network for instance.#033[00m
Nov 25 13:56:07 np0005535656 nova_compute[187219]: 2025-11-25 18:56:07.454 187223 DEBUG oslo_concurrency.lockutils [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:07 np0005535656 nova_compute[187219]: 2025-11-25 18:56:07.454 187223 DEBUG oslo_concurrency.lockutils [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:07 np0005535656 nova_compute[187219]: 2025-11-25 18:56:07.565 187223 DEBUG nova.compute.provider_tree [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:56:07 np0005535656 nova_compute[187219]: 2025-11-25 18:56:07.583 187223 DEBUG nova.scheduler.client.report [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:56:07 np0005535656 nova_compute[187219]: 2025-11-25 18:56:07.637 187223 DEBUG oslo_concurrency.lockutils [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:07 np0005535656 nova_compute[187219]: 2025-11-25 18:56:07.692 187223 INFO nova.scheduler.client.report [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Deleted allocations for instance 482cc299-5b06-4501-a819-6556a71a4ad2#033[00m
Nov 25 13:56:07 np0005535656 nova_compute[187219]: 2025-11-25 18:56:07.785 187223 DEBUG oslo_concurrency.lockutils [None req-bbe1904d-7115-4579-8e2e-d5ab6a131560 be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:08 np0005535656 nova_compute[187219]: 2025-11-25 18:56:08.420 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.103 187223 DEBUG nova.compute.manager [req-e628d3b9-a496-4907-aa64-d88f59270f72 req-a156cb09-9dc9-4670-ba99-e6f880a2377e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Received event network-vif-plugged-9583d685-83d5-480e-a534-d81a55c68f50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.104 187223 DEBUG oslo_concurrency.lockutils [req-e628d3b9-a496-4907-aa64-d88f59270f72 req-a156cb09-9dc9-4670-ba99-e6f880a2377e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.104 187223 DEBUG oslo_concurrency.lockutils [req-e628d3b9-a496-4907-aa64-d88f59270f72 req-a156cb09-9dc9-4670-ba99-e6f880a2377e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.105 187223 DEBUG oslo_concurrency.lockutils [req-e628d3b9-a496-4907-aa64-d88f59270f72 req-a156cb09-9dc9-4670-ba99-e6f880a2377e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "482cc299-5b06-4501-a819-6556a71a4ad2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.105 187223 DEBUG nova.compute.manager [req-e628d3b9-a496-4907-aa64-d88f59270f72 req-a156cb09-9dc9-4670-ba99-e6f880a2377e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] No waiting events found dispatching network-vif-plugged-9583d685-83d5-480e-a534-d81a55c68f50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.105 187223 WARNING nova.compute.manager [req-e628d3b9-a496-4907-aa64-d88f59270f72 req-a156cb09-9dc9-4670-ba99-e6f880a2377e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Received unexpected event network-vif-plugged-9583d685-83d5-480e-a534-d81a55c68f50 for instance with vm_state deleted and task_state None.#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.106 187223 DEBUG nova.compute.manager [req-e628d3b9-a496-4907-aa64-d88f59270f72 req-a156cb09-9dc9-4670-ba99-e6f880a2377e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Received event network-vif-deleted-9583d685-83d5-480e-a534-d81a55c68f50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.884 187223 DEBUG oslo_concurrency.lockutils [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "af1b51b4-9c51-443a-932e-a48750d61085" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.884 187223 DEBUG oslo_concurrency.lockutils [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.885 187223 DEBUG oslo_concurrency.lockutils [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "af1b51b4-9c51-443a-932e-a48750d61085-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.885 187223 DEBUG oslo_concurrency.lockutils [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.885 187223 DEBUG oslo_concurrency.lockutils [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.886 187223 INFO nova.compute.manager [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Terminating instance#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.887 187223 DEBUG nova.compute.manager [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 25 13:56:09 np0005535656 kernel: tap9274e936-36 (unregistering): left promiscuous mode
Nov 25 13:56:09 np0005535656 NetworkManager[55548]: <info>  [1764096969.9136] device (tap9274e936-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 13:56:09 np0005535656 ovn_controller[95460]: 2025-11-25T18:56:09Z|00055|binding|INFO|Releasing lport 9274e936-3662-499c-89b5-4b605917aad2 from this chassis (sb_readonly=0)
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.923 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:09 np0005535656 ovn_controller[95460]: 2025-11-25T18:56:09Z|00056|binding|INFO|Setting lport 9274e936-3662-499c-89b5-4b605917aad2 down in Southbound
Nov 25 13:56:09 np0005535656 ovn_controller[95460]: 2025-11-25T18:56:09Z|00057|binding|INFO|Removing iface tap9274e936-36 ovn-installed in OVS
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.926 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.939 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:09 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:09.947 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:9a:75 10.100.0.13'], port_security=['fa:16:3e:35:9a:75 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'af1b51b4-9c51-443a-932e-a48750d61085', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90f5f32749934e1bb4a31b5643dc964a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3235d006-85b4-4c07-966c-48d4df16258d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dde4be2a-475e-47e2-8532-faebae80eb26, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=9274e936-3662-499c-89b5-4b605917aad2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:56:09 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:09.948 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 9274e936-3662-499c-89b5-4b605917aad2 in datapath fe81e455-495f-4aea-8dd6-8b6f8cf5d198 unbound from our chassis#033[00m
Nov 25 13:56:09 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:09.949 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe81e455-495f-4aea-8dd6-8b6f8cf5d198, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 13:56:09 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:09.950 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab9e0f7-f382-4161-8be6-844d6bcab851]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:09 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:09.950 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 namespace which is not needed anymore#033[00m
Nov 25 13:56:09 np0005535656 nova_compute[187219]: 2025-11-25 18:56:09.962 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:09 np0005535656 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 25 13:56:09 np0005535656 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Consumed 16.954s CPU time.
Nov 25 13:56:09 np0005535656 systemd-machined[153481]: Machine qemu-3-instance-00000005 terminated.
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.112 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.119 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:10 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[209369]: [NOTICE]   (209377) : haproxy version is 2.8.14-c23fe91
Nov 25 13:56:10 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[209369]: [NOTICE]   (209377) : path to executable is /usr/sbin/haproxy
Nov 25 13:56:10 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[209369]: [WARNING]  (209377) : Exiting Master process...
Nov 25 13:56:10 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[209369]: [ALERT]    (209377) : Current worker (209380) exited with code 143 (Terminated)
Nov 25 13:56:10 np0005535656 neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198[209369]: [WARNING]  (209377) : All workers exited. Exiting... (0)
Nov 25 13:56:10 np0005535656 systemd[1]: libpod-b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2.scope: Deactivated successfully.
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.155 187223 INFO nova.virt.libvirt.driver [-] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Instance destroyed successfully.#033[00m
Nov 25 13:56:10 np0005535656 podman[210200]: 2025-11-25 18:56:10.156251463 +0000 UTC m=+0.077578542 container died b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.155 187223 DEBUG nova.objects.instance [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lazy-loading 'resources' on Instance uuid af1b51b4-9c51-443a-932e-a48750d61085 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:56:10 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2-userdata-shm.mount: Deactivated successfully.
Nov 25 13:56:10 np0005535656 systemd[1]: var-lib-containers-storage-overlay-f7dfd84f56621b9dd0dbf7e3cdfffa00a959b180e039a39f51faffae8371e203-merged.mount: Deactivated successfully.
Nov 25 13:56:10 np0005535656 podman[210200]: 2025-11-25 18:56:10.290343285 +0000 UTC m=+0.211670364 container cleanup b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:56:10 np0005535656 systemd[1]: libpod-conmon-b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2.scope: Deactivated successfully.
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.431 187223 DEBUG nova.virt.libvirt.vif [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T18:54:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-535417570',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-535417570',id=5,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T18:54:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90f5f32749934e1bb4a31b5643dc964a',ramdisk_id='',reservation_id='r-5b4iwpr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-53937300',owner_user_name='tempest-TestExecuteActionsViaActuator-53937300-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T18:54:50Z,user_data=None,user_id='be3c7719092245a3b39ec72ada0c5247',uuid=af1b51b4-9c51-443a-932e-a48750d61085,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.431 187223 DEBUG nova.network.os_vif_util [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converting VIF {"id": "9274e936-3662-499c-89b5-4b605917aad2", "address": "fa:16:3e:35:9a:75", "network": {"id": "fe81e455-495f-4aea-8dd6-8b6f8cf5d198", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1686246027-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90f5f32749934e1bb4a31b5643dc964a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9274e936-36", "ovs_interfaceid": "9274e936-3662-499c-89b5-4b605917aad2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.432 187223 DEBUG nova.network.os_vif_util [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:9a:75,bridge_name='br-int',has_traffic_filtering=True,id=9274e936-3662-499c-89b5-4b605917aad2,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9274e936-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.433 187223 DEBUG os_vif [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:9a:75,bridge_name='br-int',has_traffic_filtering=True,id=9274e936-3662-499c-89b5-4b605917aad2,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9274e936-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.435 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.435 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9274e936-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.436 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.438 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.440 187223 INFO os_vif [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:9a:75,bridge_name='br-int',has_traffic_filtering=True,id=9274e936-3662-499c-89b5-4b605917aad2,network=Network(fe81e455-495f-4aea-8dd6-8b6f8cf5d198),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9274e936-36')#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.441 187223 INFO nova.virt.libvirt.driver [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Deleting instance files /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085_del#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.441 187223 INFO nova.virt.libvirt.driver [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Deletion of /var/lib/nova/instances/af1b51b4-9c51-443a-932e-a48750d61085_del complete#033[00m
Nov 25 13:56:10 np0005535656 podman[210246]: 2025-11-25 18:56:10.452395165 +0000 UTC m=+0.136593911 container remove b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:56:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:10.458 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[63ee3c1c-db0d-4f96-8c6d-7faebeeff874]: (4, ('Tue Nov 25 06:56:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 (b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2)\nb3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2\nTue Nov 25 06:56:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 (b3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2)\nb3ac7339aa0cede18b25d877c0b0c2fc5f8df4cef42cac732df51258258cc1d2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:10.460 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[3d10932e-e496-4820-a519-17008227eed8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:10.461 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe81e455-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.462 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:10 np0005535656 kernel: tapfe81e455-40: left promiscuous mode
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.487 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:10.490 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[266f2110-3298-418c-b45a-1ee1ace5ee22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:10.505 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4d851893-397c-4bda-b81e-4ae4a99798ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:10.507 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1f90f015-3aa6-4d05-b207-b4af7cce317b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:10.522 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[d3384c35-2550-4f1a-b144-9b730c590eff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379981, 'reachable_time': 15848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210261, 'error': None, 'target': 'ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:10 np0005535656 systemd[1]: run-netns-ovnmeta\x2dfe81e455\x2d495f\x2d4aea\x2d8dd6\x2d8b6f8cf5d198.mount: Deactivated successfully.
Nov 25 13:56:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:10.525 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe81e455-495f-4aea-8dd6-8b6f8cf5d198 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 13:56:10 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:10.525 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[e0995836-be03-40f9-a31d-a0537f1b27fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.566 187223 INFO nova.compute.manager [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Took 0.68 seconds to destroy the instance on the hypervisor.#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.567 187223 DEBUG oslo.service.loopingcall [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.568 187223 DEBUG nova.compute.manager [-] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.568 187223 DEBUG nova.network.neutron [-] [instance: af1b51b4-9c51-443a-932e-a48750d61085] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.851 187223 DEBUG nova.compute.manager [req-d1165e9a-f2e4-420e-9fc4-9f784aa5cf8b req-fcbef867-bd38-4843-b074-f178702036f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Received event network-vif-unplugged-9274e936-3662-499c-89b5-4b605917aad2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.851 187223 DEBUG oslo_concurrency.lockutils [req-d1165e9a-f2e4-420e-9fc4-9f784aa5cf8b req-fcbef867-bd38-4843-b074-f178702036f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "af1b51b4-9c51-443a-932e-a48750d61085-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.852 187223 DEBUG oslo_concurrency.lockutils [req-d1165e9a-f2e4-420e-9fc4-9f784aa5cf8b req-fcbef867-bd38-4843-b074-f178702036f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.852 187223 DEBUG oslo_concurrency.lockutils [req-d1165e9a-f2e4-420e-9fc4-9f784aa5cf8b req-fcbef867-bd38-4843-b074-f178702036f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.852 187223 DEBUG nova.compute.manager [req-d1165e9a-f2e4-420e-9fc4-9f784aa5cf8b req-fcbef867-bd38-4843-b074-f178702036f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] No waiting events found dispatching network-vif-unplugged-9274e936-3662-499c-89b5-4b605917aad2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:56:10 np0005535656 nova_compute[187219]: 2025-11-25 18:56:10.852 187223 DEBUG nova.compute.manager [req-d1165e9a-f2e4-420e-9fc4-9f784aa5cf8b req-fcbef867-bd38-4843-b074-f178702036f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Received event network-vif-unplugged-9274e936-3662-499c-89b5-4b605917aad2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 13:56:12 np0005535656 nova_compute[187219]: 2025-11-25 18:56:12.573 187223 DEBUG nova.network.neutron [-] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:56:12 np0005535656 nova_compute[187219]: 2025-11-25 18:56:12.718 187223 INFO nova.compute.manager [-] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Took 2.15 seconds to deallocate network for instance.#033[00m
Nov 25 13:56:12 np0005535656 nova_compute[187219]: 2025-11-25 18:56:12.817 187223 DEBUG oslo_concurrency.lockutils [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:12 np0005535656 nova_compute[187219]: 2025-11-25 18:56:12.818 187223 DEBUG oslo_concurrency.lockutils [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:12 np0005535656 nova_compute[187219]: 2025-11-25 18:56:12.926 187223 DEBUG nova.compute.provider_tree [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:56:12 np0005535656 nova_compute[187219]: 2025-11-25 18:56:12.959 187223 DEBUG nova.scheduler.client.report [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.047 187223 DEBUG oslo_concurrency.lockutils [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.099 187223 DEBUG nova.compute.manager [req-8a67ec16-f92f-4d68-a4eb-03a5af64a62a req-fc06d13d-3db8-42a4-a3c4-636452f64df5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Received event network-vif-plugged-9274e936-3662-499c-89b5-4b605917aad2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.100 187223 DEBUG oslo_concurrency.lockutils [req-8a67ec16-f92f-4d68-a4eb-03a5af64a62a req-fc06d13d-3db8-42a4-a3c4-636452f64df5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "af1b51b4-9c51-443a-932e-a48750d61085-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.100 187223 DEBUG oslo_concurrency.lockutils [req-8a67ec16-f92f-4d68-a4eb-03a5af64a62a req-fc06d13d-3db8-42a4-a3c4-636452f64df5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.101 187223 DEBUG oslo_concurrency.lockutils [req-8a67ec16-f92f-4d68-a4eb-03a5af64a62a req-fc06d13d-3db8-42a4-a3c4-636452f64df5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.101 187223 DEBUG nova.compute.manager [req-8a67ec16-f92f-4d68-a4eb-03a5af64a62a req-fc06d13d-3db8-42a4-a3c4-636452f64df5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] No waiting events found dispatching network-vif-plugged-9274e936-3662-499c-89b5-4b605917aad2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.102 187223 WARNING nova.compute.manager [req-8a67ec16-f92f-4d68-a4eb-03a5af64a62a req-fc06d13d-3db8-42a4-a3c4-636452f64df5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Received unexpected event network-vif-plugged-9274e936-3662-499c-89b5-4b605917aad2 for instance with vm_state deleted and task_state None.#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.102 187223 DEBUG nova.compute.manager [req-8a67ec16-f92f-4d68-a4eb-03a5af64a62a req-fc06d13d-3db8-42a4-a3c4-636452f64df5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Received event network-vif-deleted-9274e936-3662-499c-89b5-4b605917aad2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.181 187223 INFO nova.scheduler.client.report [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Deleted allocations for instance af1b51b4-9c51-443a-932e-a48750d61085#033[00m
Nov 25 13:56:13 np0005535656 nova_compute[187219]: 2025-11-25 18:56:13.356 187223 DEBUG oslo_concurrency.lockutils [None req-a4af4689-2400-4443-9c04-c08f2939070f be3c7719092245a3b39ec72ada0c5247 90f5f32749934e1bb4a31b5643dc964a - - default default] Lock "af1b51b4-9c51-443a-932e-a48750d61085" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:15 np0005535656 nova_compute[187219]: 2025-11-25 18:56:15.010 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:15 np0005535656 podman[210262]: 2025-11-25 18:56:15.039387522 +0000 UTC m=+0.139353848 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 13:56:15 np0005535656 nova_compute[187219]: 2025-11-25 18:56:15.437 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:18 np0005535656 nova_compute[187219]: 2025-11-25 18:56:18.398 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764096963.3975039, 482cc299-5b06-4501-a819-6556a71a4ad2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:56:18 np0005535656 nova_compute[187219]: 2025-11-25 18:56:18.399 187223 INFO nova.compute.manager [-] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] VM Stopped (Lifecycle Event)#033[00m
Nov 25 13:56:18 np0005535656 nova_compute[187219]: 2025-11-25 18:56:18.476 187223 DEBUG nova.compute.manager [None req-99b74c47-f133-4935-a117-070454bc28cb - - - - - -] [instance: 482cc299-5b06-4501-a819-6556a71a4ad2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:56:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:56:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:56:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:56:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:56:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:56:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:56:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:56:20 np0005535656 nova_compute[187219]: 2025-11-25 18:56:20.054 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:20 np0005535656 nova_compute[187219]: 2025-11-25 18:56:20.439 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:23 np0005535656 podman[210288]: 2025-11-25 18:56:23.969254171 +0000 UTC m=+0.087832043 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:56:23 np0005535656 podman[210287]: 2025-11-25 18:56:23.991242415 +0000 UTC m=+0.104089310 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 25 13:56:25 np0005535656 nova_compute[187219]: 2025-11-25 18:56:25.055 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:25 np0005535656 nova_compute[187219]: 2025-11-25 18:56:25.153 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764096970.151736, af1b51b4-9c51-443a-932e-a48750d61085 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:56:25 np0005535656 nova_compute[187219]: 2025-11-25 18:56:25.154 187223 INFO nova.compute.manager [-] [instance: af1b51b4-9c51-443a-932e-a48750d61085] VM Stopped (Lifecycle Event)#033[00m
Nov 25 13:56:25 np0005535656 nova_compute[187219]: 2025-11-25 18:56:25.206 187223 DEBUG nova.compute.manager [None req-d5c5836d-febe-4c2f-a6a1-c6040ff1e7ff - - - - - -] [instance: af1b51b4-9c51-443a-932e-a48750d61085] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:56:25 np0005535656 nova_compute[187219]: 2025-11-25 18:56:25.440 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:30 np0005535656 nova_compute[187219]: 2025-11-25 18:56:30.057 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:30 np0005535656 nova_compute[187219]: 2025-11-25 18:56:30.442 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:30 np0005535656 podman[210334]: 2025-11-25 18:56:30.980603376 +0000 UTC m=+0.095390862 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 25 13:56:33 np0005535656 podman[210355]: 2025-11-25 18:56:33.927832441 +0000 UTC m=+0.054481837 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:56:35 np0005535656 nova_compute[187219]: 2025-11-25 18:56:35.061 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:35 np0005535656 nova_compute[187219]: 2025-11-25 18:56:35.445 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:35 np0005535656 podman[197580]: time="2025-11-25T18:56:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:56:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:56:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:56:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:56:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Nov 25 13:56:39 np0005535656 nova_compute[187219]: 2025-11-25 18:56:39.629 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:40 np0005535656 nova_compute[187219]: 2025-11-25 18:56:40.063 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:40 np0005535656 nova_compute[187219]: 2025-11-25 18:56:40.446 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:45 np0005535656 nova_compute[187219]: 2025-11-25 18:56:45.066 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:45 np0005535656 nova_compute[187219]: 2025-11-25 18:56:45.447 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:45 np0005535656 podman[210377]: 2025-11-25 18:56:45.974900681 +0000 UTC m=+0.082060411 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 13:56:47 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:47.453 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:56:47 np0005535656 nova_compute[187219]: 2025-11-25 18:56:47.454 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:47 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:47.455 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 13:56:47 np0005535656 nova_compute[187219]: 2025-11-25 18:56:47.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:56:48 np0005535656 nova_compute[187219]: 2025-11-25 18:56:48.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:56:48 np0005535656 nova_compute[187219]: 2025-11-25 18:56:48.706 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:56:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:56:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:56:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:56:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:56:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:56:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:56:49 np0005535656 nova_compute[187219]: 2025-11-25 18:56:49.706 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:56:50 np0005535656 nova_compute[187219]: 2025-11-25 18:56:50.068 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:50 np0005535656 nova_compute[187219]: 2025-11-25 18:56:50.449 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:51 np0005535656 nova_compute[187219]: 2025-11-25 18:56:51.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:56:51 np0005535656 nova_compute[187219]: 2025-11-25 18:56:51.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:56:51 np0005535656 nova_compute[187219]: 2025-11-25 18:56:51.877 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:56:52 np0005535656 nova_compute[187219]: 2025-11-25 18:56:52.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:56:52 np0005535656 nova_compute[187219]: 2025-11-25 18:56:52.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:56:52 np0005535656 nova_compute[187219]: 2025-11-25 18:56:52.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:56:52 np0005535656 nova_compute[187219]: 2025-11-25 18:56:52.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:56:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:53.457 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:56:54 np0005535656 podman[210405]: 2025-11-25 18:56:54.944817808 +0000 UTC m=+0.050477965 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 25 13:56:54 np0005535656 podman[210404]: 2025-11-25 18:56:54.978280045 +0000 UTC m=+0.093199020 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:56:55 np0005535656 nova_compute[187219]: 2025-11-25 18:56:55.069 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:55 np0005535656 nova_compute[187219]: 2025-11-25 18:56:55.470 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:56:55 np0005535656 nova_compute[187219]: 2025-11-25 18:56:55.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:56:55 np0005535656 nova_compute[187219]: 2025-11-25 18:56:55.841 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:55 np0005535656 nova_compute[187219]: 2025-11-25 18:56:55.842 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:55 np0005535656 nova_compute[187219]: 2025-11-25 18:56:55.842 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:55 np0005535656 nova_compute[187219]: 2025-11-25 18:56:55.842 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.083 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.085 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5892MB free_disk=73.16788101196289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.085 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.086 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.180 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.181 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.262 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.380 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.477 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:56:56 np0005535656 nova_compute[187219]: 2025-11-25 18:56:56.478 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:56:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:59.070 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:56:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:59.070 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:56:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:56:59.071 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:57:00 np0005535656 nova_compute[187219]: 2025-11-25 18:57:00.073 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:00 np0005535656 nova_compute[187219]: 2025-11-25 18:57:00.471 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:00 np0005535656 nova_compute[187219]: 2025-11-25 18:57:00.477 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:57:01 np0005535656 podman[210452]: 2025-11-25 18:57:01.976187673 +0000 UTC m=+0.095823099 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Nov 25 13:57:04 np0005535656 podman[210474]: 2025-11-25 18:57:04.933661181 +0000 UTC m=+0.058291824 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 13:57:05 np0005535656 nova_compute[187219]: 2025-11-25 18:57:05.075 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:05 np0005535656 nova_compute[187219]: 2025-11-25 18:57:05.473 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:05 np0005535656 podman[197580]: time="2025-11-25T18:57:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:57:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:57:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:57:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:57:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Nov 25 13:57:10 np0005535656 nova_compute[187219]: 2025-11-25 18:57:10.077 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:10 np0005535656 nova_compute[187219]: 2025-11-25 18:57:10.476 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:15 np0005535656 nova_compute[187219]: 2025-11-25 18:57:15.085 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:15 np0005535656 nova_compute[187219]: 2025-11-25 18:57:15.478 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:16 np0005535656 ovn_controller[95460]: 2025-11-25T18:57:16Z|00058|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 25 13:57:16 np0005535656 podman[210495]: 2025-11-25 18:57:16.932466111 +0000 UTC m=+0.056800684 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 13:57:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:57:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:57:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:57:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:57:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:57:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:57:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:57:20 np0005535656 nova_compute[187219]: 2025-11-25 18:57:20.088 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:20 np0005535656 nova_compute[187219]: 2025-11-25 18:57:20.480 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:25 np0005535656 nova_compute[187219]: 2025-11-25 18:57:25.090 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:25 np0005535656 nova_compute[187219]: 2025-11-25 18:57:25.482 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:25 np0005535656 podman[210521]: 2025-11-25 18:57:25.967429412 +0000 UTC m=+0.085406562 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 25 13:57:25 np0005535656 podman[210520]: 2025-11-25 18:57:25.994095067 +0000 UTC m=+0.116238859 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 25 13:57:30 np0005535656 nova_compute[187219]: 2025-11-25 18:57:30.093 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:30 np0005535656 nova_compute[187219]: 2025-11-25 18:57:30.484 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:32 np0005535656 podman[210564]: 2025-11-25 18:57:32.934230147 +0000 UTC m=+0.059917438 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9)
Nov 25 13:57:35 np0005535656 nova_compute[187219]: 2025-11-25 18:57:35.094 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:35 np0005535656 nova_compute[187219]: 2025-11-25 18:57:35.485 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:35 np0005535656 podman[197580]: time="2025-11-25T18:57:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:57:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:57:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 13:57:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:57:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Nov 25 13:57:35 np0005535656 podman[210585]: 2025-11-25 18:57:35.974496115 +0000 UTC m=+0.093583800 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 25 13:57:40 np0005535656 nova_compute[187219]: 2025-11-25 18:57:40.096 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:40 np0005535656 nova_compute[187219]: 2025-11-25 18:57:40.486 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.306 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.306 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.333 187223 DEBUG nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.467 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.467 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.479 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.479 187223 INFO nova.compute.claims [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.672 187223 DEBUG nova.compute.provider_tree [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.859 187223 DEBUG nova.scheduler.client.report [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.982 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:57:42 np0005535656 nova_compute[187219]: 2025-11-25 18:57:42.983 187223 DEBUG nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.158 187223 DEBUG nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.158 187223 DEBUG nova.network.neutron [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.198 187223 INFO nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.232 187223 DEBUG nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.384 187223 DEBUG nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.386 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.386 187223 INFO nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Creating image(s)#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.387 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "/var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.387 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "/var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.388 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "/var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.406 187223 DEBUG nova.policy [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '752e3dfa795e4fd781c1bbb04a2f8e22', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c58d5fa6c9449d0ade4f6e196f5da2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.409 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.508 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.509 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.509 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.519 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.572 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.573 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.683 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk 1073741824" returned: 0 in 0.111s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.685 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.685 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.756 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.757 187223 DEBUG nova.virt.disk.api [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Checking if we can resize image /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.758 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.808 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.809 187223 DEBUG nova.virt.disk.api [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Cannot resize image /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.810 187223 DEBUG nova.objects.instance [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lazy-loading 'migration_context' on Instance uuid bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.861 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.861 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Ensure instance console log exists: /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.862 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.863 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 13:57:43 np0005535656 nova_compute[187219]: 2025-11-25 18:57:43.863 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 13:57:44 np0005535656 nova_compute[187219]: 2025-11-25 18:57:44.650 187223 DEBUG nova.network.neutron [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Successfully created port: 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 25 13:57:45 np0005535656 nova_compute[187219]: 2025-11-25 18:57:45.100 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:57:45 np0005535656 nova_compute[187219]: 2025-11-25 18:57:45.492 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 13:57:45 np0005535656 nova_compute[187219]: 2025-11-25 18:57:45.777 187223 DEBUG nova.network.neutron [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Successfully updated port: 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 25 13:57:46 np0005535656 nova_compute[187219]: 2025-11-25 18:57:46.012 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 25 13:57:46 np0005535656 nova_compute[187219]: 2025-11-25 18:57:46.012 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquired lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 25 13:57:46 np0005535656 nova_compute[187219]: 2025-11-25 18:57:46.013 187223 DEBUG nova.network.neutron [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 25 13:57:46 np0005535656 nova_compute[187219]: 2025-11-25 18:57:46.075 187223 DEBUG nova.compute.manager [req-a9365197-a1d0-43d6-bbc7-e99bd0b96e25 req-31dcc134-fb70-4024-9d41-597106bc71cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Received event network-changed-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 25 13:57:46 np0005535656 nova_compute[187219]: 2025-11-25 18:57:46.075 187223 DEBUG nova.compute.manager [req-a9365197-a1d0-43d6-bbc7-e99bd0b96e25 req-31dcc134-fb70-4024-9d41-597106bc71cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Refreshing instance network info cache due to event network-changed-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 25 13:57:46 np0005535656 nova_compute[187219]: 2025-11-25 18:57:46.076 187223 DEBUG oslo_concurrency.lockutils [req-a9365197-a1d0-43d6-bbc7-e99bd0b96e25 req-31dcc134-fb70-4024-9d41-597106bc71cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 25 13:57:46 np0005535656 nova_compute[187219]: 2025-11-25 18:57:46.395 187223 DEBUG nova.network.neutron [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.428 187223 DEBUG nova.network.neutron [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Updating instance_info_cache with network_info: [{"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.542 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Releasing lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.543 187223 DEBUG nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Instance network_info: |[{"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.544 187223 DEBUG oslo_concurrency.lockutils [req-a9365197-a1d0-43d6-bbc7-e99bd0b96e25 req-31dcc134-fb70-4024-9d41-597106bc71cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.544 187223 DEBUG nova.network.neutron [req-a9365197-a1d0-43d6-bbc7-e99bd0b96e25 req-31dcc134-fb70-4024-9d41-597106bc71cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Refreshing network info cache for port 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.550 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Start _get_guest_xml network_info=[{"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.556 187223 WARNING nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.563 187223 DEBUG nova.virt.libvirt.host [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.564 187223 DEBUG nova.virt.libvirt.host [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.570 187223 DEBUG nova.virt.libvirt.host [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.571 187223 DEBUG nova.virt.libvirt.host [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.573 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.574 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.575 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.575 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.576 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.576 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.577 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.577 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.578 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.579 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.579 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.579 187223 DEBUG nova.virt.hardware [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.587 187223 DEBUG nova.virt.libvirt.vif [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:57:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1978486513',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1978486513',id=7,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c58d5fa6c9449d0ade4f6e196f5da2b',ramdisk_id='',reservation_id='r-0b3zgyo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1744823125',owner_user_name='tempest-TestExecuteBasicStrategy-1744823125-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:57:43Z,user_data=None,user_id='752e3dfa795e4fd781c1bbb04a2f8e22',uuid=bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.588 187223 DEBUG nova.network.os_vif_util [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Converting VIF {"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.589 187223 DEBUG nova.network.os_vif_util [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:5b:c3,bridge_name='br-int',has_traffic_filtering=True,id=3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3021cfeb-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.591 187223 DEBUG nova.objects.instance [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lazy-loading 'pci_devices' on Instance uuid bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.671 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] End _get_guest_xml xml=<domain type="kvm">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <uuid>bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6</uuid>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <name>instance-00000007</name>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteBasicStrategy-server-1978486513</nova:name>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 18:57:47</nova:creationTime>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:        <nova:user uuid="752e3dfa795e4fd781c1bbb04a2f8e22">tempest-TestExecuteBasicStrategy-1744823125-project-member</nova:user>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:        <nova:project uuid="5c58d5fa6c9449d0ade4f6e196f5da2b">tempest-TestExecuteBasicStrategy-1744823125</nova:project>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:        <nova:port uuid="3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <system>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <entry name="serial">bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6</entry>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <entry name="uuid">bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6</entry>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    </system>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <os>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  </os>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <features>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  </features>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  </clock>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  <devices>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk.config"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    </disk>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:1c:5b:c3"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <target dev="tap3021cfeb-4e"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    </interface>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/console.log" append="off"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    </serial>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <video>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    </video>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    </rng>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 13:57:47 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 13:57:47 np0005535656 nova_compute[187219]:  </devices>
Nov 25 13:57:47 np0005535656 nova_compute[187219]: </domain>
Nov 25 13:57:47 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.672 187223 DEBUG nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Preparing to wait for external event network-vif-plugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.673 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.673 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.674 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.675 187223 DEBUG nova.virt.libvirt.vif [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T18:57:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1978486513',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1978486513',id=7,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c58d5fa6c9449d0ade4f6e196f5da2b',ramdisk_id='',reservation_id='r-0b3zgyo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1744823125',owner_user_name='tempest-TestExecuteBasicStrategy-1744823125-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:57:43Z,user_data=None,user_id='752e3dfa795e4fd781c1bbb04a2f8e22',uuid=bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.675 187223 DEBUG nova.network.os_vif_util [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Converting VIF {"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.676 187223 DEBUG nova.network.os_vif_util [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:5b:c3,bridge_name='br-int',has_traffic_filtering=True,id=3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3021cfeb-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.677 187223 DEBUG os_vif [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:5b:c3,bridge_name='br-int',has_traffic_filtering=True,id=3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3021cfeb-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.677 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.678 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.679 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.679 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.682 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.683 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3021cfeb-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.684 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3021cfeb-4e, col_values=(('external_ids', {'iface-id': '3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:5b:c3', 'vm-uuid': 'bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:57:47 np0005535656 NetworkManager[55548]: <info>  [1764097067.7152] manager: (tap3021cfeb-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.715 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.717 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.723 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:47 np0005535656 nova_compute[187219]: 2025-11-25 18:57:47.724 187223 INFO os_vif [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:5b:c3,bridge_name='br-int',has_traffic_filtering=True,id=3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3021cfeb-4e')#033[00m
Nov 25 13:57:47 np0005535656 podman[210622]: 2025-11-25 18:57:47.96412678 +0000 UTC m=+0.078738522 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 13:57:48 np0005535656 nova_compute[187219]: 2025-11-25 18:57:48.100 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 13:57:48 np0005535656 nova_compute[187219]: 2025-11-25 18:57:48.101 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 13:57:48 np0005535656 nova_compute[187219]: 2025-11-25 18:57:48.101 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] No VIF found with MAC fa:16:3e:1c:5b:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 13:57:48 np0005535656 nova_compute[187219]: 2025-11-25 18:57:48.102 187223 INFO nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Using config drive#033[00m
Nov 25 13:57:48 np0005535656 nova_compute[187219]: 2025-11-25 18:57:48.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.273 187223 INFO nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Creating config drive at /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk.config#033[00m
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.283 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaeu0i48h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.412 187223 DEBUG oslo_concurrency.processutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaeu0i48h" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:57:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:57:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:57:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:57:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:57:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:57:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:57:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:57:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:57:49 np0005535656 kernel: tap3021cfeb-4e: entered promiscuous mode
Nov 25 13:57:49 np0005535656 NetworkManager[55548]: <info>  [1764097069.4909] manager: (tap3021cfeb-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.491 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:49 np0005535656 ovn_controller[95460]: 2025-11-25T18:57:49Z|00059|binding|INFO|Claiming lport 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa for this chassis.
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.493 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:49 np0005535656 ovn_controller[95460]: 2025-11-25T18:57:49Z|00060|binding|INFO|3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa: Claiming fa:16:3e:1c:5b:c3 10.100.0.3
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.495 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:49 np0005535656 systemd-udevd[210666]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:57:49 np0005535656 NetworkManager[55548]: <info>  [1764097069.5311] device (tap3021cfeb-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 13:57:49 np0005535656 NetworkManager[55548]: <info>  [1764097069.5326] device (tap3021cfeb-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 13:57:49 np0005535656 systemd-machined[153481]: New machine qemu-5-instance-00000007.
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.537 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:5b:c3 10.100.0.3'], port_security=['fa:16:3e:1c:5b:c3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c58d5fa6c9449d0ade4f6e196f5da2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7a99ea8d-0bac-44ff-8604-506502702def', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61f992e8-7ad3-4f4f-9d1f-85cebb75b26e, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.538 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa in datapath ec24c862-d31e-4059-8ccd-fa96e33dc558 bound to our chassis#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.540 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec24c862-d31e-4059-8ccd-fa96e33dc558#033[00m
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.546 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.549 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5123a2-b5db-46e8-8527-52d51e921171]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.550 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapec24c862-d1 in ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.551 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapec24c862-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.551 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[da8b410f-3d91-46af-a585-b821a4f50d52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.552 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2d825d38-8144-4e8b-9cc5-efc6282b9456]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_controller[95460]: 2025-11-25T18:57:49Z|00061|binding|INFO|Setting lport 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa ovn-installed in OVS
Nov 25 13:57:49 np0005535656 ovn_controller[95460]: 2025-11-25T18:57:49Z|00062|binding|INFO|Setting lport 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa up in Southbound
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.553 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:49 np0005535656 systemd[1]: Started Virtual Machine qemu-5-instance-00000007.
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.564 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[cb80ed02-47c1-47f9-8b83-2d4618774370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.596 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[38a53031-0d2a-49e2-80ab-6fdef6dce6a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.624 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[497cb3b2-450d-4d83-b2ad-797e8491acff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.631 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[46a0fd93-0b5e-47df-af03-594f47dffa42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 NetworkManager[55548]: <info>  [1764097069.6321] manager: (tapec24c862-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.654 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[31477db2-0474-4525-b66a-0f5544f77e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.660 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d52625-56d1-435e-b72b-20bacb65552e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:57:49 np0005535656 NetworkManager[55548]: <info>  [1764097069.6838] device (tapec24c862-d0): carrier: link connected
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.686 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c4bc4c-c38a-4c79-b81a-570d596f4133]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.704 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[dde3178e-1968-4cc7-8f00-70efea6a7cef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec24c862-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:3a:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403120, 'reachable_time': 39128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210704, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.725 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[891c8c64-d820-4ee2-870d-8feebb40b837]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:3a47'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403120, 'tstamp': 403120}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210705, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.748 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4432b9-04b2-4054-b883-7f72ac12a131]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec24c862-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:3a:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403120, 'reachable_time': 39128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210706, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.792 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[27c75950-30fa-4877-878d-079881ff30d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.847 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[27700001-c65e-47ec-b1df-690fb9243e8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.848 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec24c862-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.848 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.849 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec24c862-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.875 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:49 np0005535656 kernel: tapec24c862-d0: entered promiscuous mode
Nov 25 13:57:49 np0005535656 NetworkManager[55548]: <info>  [1764097069.8766] manager: (tapec24c862-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.878 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.878 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec24c862-d0, col_values=(('external_ids', {'iface-id': '55ac4047-68e6-4347-bbd5-a2ba83dd6713'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:57:49 np0005535656 ovn_controller[95460]: 2025-11-25T18:57:49Z|00063|binding|INFO|Releasing lport 55ac4047-68e6-4347-bbd5-a2ba83dd6713 from this chassis (sb_readonly=0)
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.879 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.902 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.904 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec24c862-d31e-4059-8ccd-fa96e33dc558.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec24c862-d31e-4059-8ccd-fa96e33dc558.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.906 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[224a5d74-4a09-44e6-9060-064a54a1485c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.907 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-ec24c862-d31e-4059-8ccd-fa96e33dc558
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/ec24c862-d31e-4059-8ccd-fa96e33dc558.pid.haproxy
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID ec24c862-d31e-4059-8ccd-fa96e33dc558
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 13:57:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:49.910 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'env', 'PROCESS_TAG=haproxy-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ec24c862-d31e-4059-8ccd-fa96e33dc558.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.992 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097069.9924052, bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:57:49 np0005535656 nova_compute[187219]: 2025-11-25 18:57:49.993 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] VM Started (Lifecycle Event)#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.040 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.043 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097069.9925313, bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.044 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] VM Paused (Lifecycle Event)#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.099 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.117 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.119 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.144 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 13:57:50 np0005535656 podman[210745]: 2025-11-25 18:57:50.273375209 +0000 UTC m=+0.021166968 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.491 187223 DEBUG nova.network.neutron [req-a9365197-a1d0-43d6-bbc7-e99bd0b96e25 req-31dcc134-fb70-4024-9d41-597106bc71cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Updated VIF entry in instance network info cache for port 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.491 187223 DEBUG nova.network.neutron [req-a9365197-a1d0-43d6-bbc7-e99bd0b96e25 req-31dcc134-fb70-4024-9d41-597106bc71cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Updating instance_info_cache with network_info: [{"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.574 187223 DEBUG oslo_concurrency.lockutils [req-a9365197-a1d0-43d6-bbc7-e99bd0b96e25 req-31dcc134-fb70-4024-9d41-597106bc71cd 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:57:50 np0005535656 podman[210745]: 2025-11-25 18:57:50.584480701 +0000 UTC m=+0.332272430 container create 6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 13:57:50 np0005535656 systemd[1]: Started libpod-conmon-6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c.scope.
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.711 187223 DEBUG nova.compute.manager [req-933b4289-6151-4105-ae25-5a117e994cf3 req-1953e74e-5ac4-4d28-9f92-3ba780cd104e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Received event network-vif-plugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.713 187223 DEBUG oslo_concurrency.lockutils [req-933b4289-6151-4105-ae25-5a117e994cf3 req-1953e74e-5ac4-4d28-9f92-3ba780cd104e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.713 187223 DEBUG oslo_concurrency.lockutils [req-933b4289-6151-4105-ae25-5a117e994cf3 req-1953e74e-5ac4-4d28-9f92-3ba780cd104e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.713 187223 DEBUG oslo_concurrency.lockutils [req-933b4289-6151-4105-ae25-5a117e994cf3 req-1953e74e-5ac4-4d28-9f92-3ba780cd104e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.714 187223 DEBUG nova.compute.manager [req-933b4289-6151-4105-ae25-5a117e994cf3 req-1953e74e-5ac4-4d28-9f92-3ba780cd104e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Processing event network-vif-plugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.714 187223 DEBUG nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.719 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097070.7190537, bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.719 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] VM Resumed (Lifecycle Event)#033[00m
Nov 25 13:57:50 np0005535656 systemd[1]: Started libcrun container.
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.721 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.724 187223 INFO nova.virt.libvirt.driver [-] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Instance spawned successfully.#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.725 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 13:57:50 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae13029a561b7faf15c5f2a9ef0964a6ef02f98c7fa69c812cf5f3c9e7809d3a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 13:57:50 np0005535656 podman[210745]: 2025-11-25 18:57:50.816189443 +0000 UTC m=+0.563981222 container init 6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 25 13:57:50 np0005535656 podman[210745]: 2025-11-25 18:57:50.822118322 +0000 UTC m=+0.569910061 container start 6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.834 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.840 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.844 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.845 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.845 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.846 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.846 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:57:50 np0005535656 nova_compute[187219]: 2025-11-25 18:57:50.847 187223 DEBUG nova.virt.libvirt.driver [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 13:57:50 np0005535656 neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558[210760]: [NOTICE]   (210764) : New worker (210766) forked
Nov 25 13:57:50 np0005535656 neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558[210760]: [NOTICE]   (210764) : Loading success.
Nov 25 13:57:51 np0005535656 nova_compute[187219]: 2025-11-25 18:57:51.081 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 13:57:51 np0005535656 nova_compute[187219]: 2025-11-25 18:57:51.353 187223 INFO nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Took 7.97 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 13:57:51 np0005535656 nova_compute[187219]: 2025-11-25 18:57:51.355 187223 DEBUG nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:57:51 np0005535656 nova_compute[187219]: 2025-11-25 18:57:51.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:57:51 np0005535656 nova_compute[187219]: 2025-11-25 18:57:51.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:57:51 np0005535656 nova_compute[187219]: 2025-11-25 18:57:51.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:57:51 np0005535656 nova_compute[187219]: 2025-11-25 18:57:51.807 187223 INFO nova.compute.manager [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Took 9.38 seconds to build instance.#033[00m
Nov 25 13:57:51 np0005535656 nova_compute[187219]: 2025-11-25 18:57:51.968 187223 DEBUG oslo_concurrency.lockutils [None req-03cc7e01-13c6-4f55-b632-0f3cd8f19c18 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.422 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.423 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.424 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.424 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.717 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.916 187223 DEBUG nova.compute.manager [req-37ab5e40-4acc-479c-8efc-568a4c5f4c1e req-328add9b-d05f-41f6-98b7-9b30cb491e37 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Received event network-vif-plugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.917 187223 DEBUG oslo_concurrency.lockutils [req-37ab5e40-4acc-479c-8efc-568a4c5f4c1e req-328add9b-d05f-41f6-98b7-9b30cb491e37 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.918 187223 DEBUG oslo_concurrency.lockutils [req-37ab5e40-4acc-479c-8efc-568a4c5f4c1e req-328add9b-d05f-41f6-98b7-9b30cb491e37 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.918 187223 DEBUG oslo_concurrency.lockutils [req-37ab5e40-4acc-479c-8efc-568a4c5f4c1e req-328add9b-d05f-41f6-98b7-9b30cb491e37 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.919 187223 DEBUG nova.compute.manager [req-37ab5e40-4acc-479c-8efc-568a4c5f4c1e req-328add9b-d05f-41f6-98b7-9b30cb491e37 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] No waiting events found dispatching network-vif-plugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:57:52 np0005535656 nova_compute[187219]: 2025-11-25 18:57:52.919 187223 WARNING nova.compute.manager [req-37ab5e40-4acc-479c-8efc-568a4c5f4c1e req-328add9b-d05f-41f6-98b7-9b30cb491e37 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Received unexpected event network-vif-plugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa for instance with vm_state active and task_state None.#033[00m
Nov 25 13:57:53 np0005535656 nova_compute[187219]: 2025-11-25 18:57:53.018 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:53.018 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:57:53 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:53.022 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 13:57:55 np0005535656 nova_compute[187219]: 2025-11-25 18:57:55.102 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:55 np0005535656 nova_compute[187219]: 2025-11-25 18:57:55.625 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Updating instance_info_cache with network_info: [{"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:57:55 np0005535656 nova_compute[187219]: 2025-11-25 18:57:55.655 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:57:55 np0005535656 nova_compute[187219]: 2025-11-25 18:57:55.656 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 13:57:55 np0005535656 nova_compute[187219]: 2025-11-25 18:57:55.656 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:57:55 np0005535656 nova_compute[187219]: 2025-11-25 18:57:55.657 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:57:55 np0005535656 nova_compute[187219]: 2025-11-25 18:57:55.657 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:57:55 np0005535656 nova_compute[187219]: 2025-11-25 18:57:55.657 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:57:56 np0005535656 nova_compute[187219]: 2025-11-25 18:57:56.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:57:56 np0005535656 nova_compute[187219]: 2025-11-25 18:57:56.691 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:56 np0005535656 nova_compute[187219]: 2025-11-25 18:57:56.692 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:56 np0005535656 nova_compute[187219]: 2025-11-25 18:57:56.692 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:57:56 np0005535656 nova_compute[187219]: 2025-11-25 18:57:56.692 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:57:56 np0005535656 nova_compute[187219]: 2025-11-25 18:57:56.765 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:57:56 np0005535656 podman[210777]: 2025-11-25 18:57:56.787010595 +0000 UTC m=+0.049621542 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Nov 25 13:57:56 np0005535656 podman[210776]: 2025-11-25 18:57:56.822220449 +0000 UTC m=+0.089368597 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 13:57:56 np0005535656 nova_compute[187219]: 2025-11-25 18:57:56.842 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:57:56 np0005535656 nova_compute[187219]: 2025-11-25 18:57:56.843 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:57:56 np0005535656 nova_compute[187219]: 2025-11-25 18:57:56.895 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.069 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.071 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5738MB free_disk=73.16703414916992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.071 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.071 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.184 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.184 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.185 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.203 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing inventories for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.234 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating ProviderTree inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.235 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.251 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing aggregate associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.291 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing trait associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.384 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.410 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.446 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.447 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:57:57 np0005535656 nova_compute[187219]: 2025-11-25 18:57:57.719 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:57:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:59.073 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:57:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:59.075 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:57:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:57:59.076 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:58:00 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:58:00.025 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:58:00 np0005535656 nova_compute[187219]: 2025-11-25 18:58:00.105 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:01 np0005535656 nova_compute[187219]: 2025-11-25 18:58:01.448 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:02 np0005535656 nova_compute[187219]: 2025-11-25 18:58:02.721 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:03 np0005535656 podman[210827]: 2025-11-25 18:58:03.936530668 +0000 UTC m=+0.054552073 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=edpm, architecture=x86_64)
Nov 25 13:58:05 np0005535656 nova_compute[187219]: 2025-11-25 18:58:05.106 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:05 np0005535656 podman[197580]: time="2025-11-25T18:58:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:58:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:58:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:58:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:58:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3051 "" "Go-http-client/1.1"
Nov 25 13:58:05 np0005535656 ovn_controller[95460]: 2025-11-25T18:58:05Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1c:5b:c3 10.100.0.3
Nov 25 13:58:05 np0005535656 ovn_controller[95460]: 2025-11-25T18:58:05Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:5b:c3 10.100.0.3
Nov 25 13:58:06 np0005535656 podman[210869]: 2025-11-25 18:58:06.947925502 +0000 UTC m=+0.072305129 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 13:58:07 np0005535656 nova_compute[187219]: 2025-11-25 18:58:07.724 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:10 np0005535656 nova_compute[187219]: 2025-11-25 18:58:10.108 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:12 np0005535656 nova_compute[187219]: 2025-11-25 18:58:12.727 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:15 np0005535656 nova_compute[187219]: 2025-11-25 18:58:15.111 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:17 np0005535656 nova_compute[187219]: 2025-11-25 18:58:17.771 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:18 np0005535656 podman[210890]: 2025-11-25 18:58:18.927356992 +0000 UTC m=+0.049865238 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 13:58:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:58:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:58:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:58:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:58:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:58:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:58:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:58:20 np0005535656 nova_compute[187219]: 2025-11-25 18:58:20.114 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:22 np0005535656 nova_compute[187219]: 2025-11-25 18:58:22.773 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:25 np0005535656 nova_compute[187219]: 2025-11-25 18:58:25.114 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:26 np0005535656 podman[210917]: 2025-11-25 18:58:26.92038963 +0000 UTC m=+0.044997177 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 13:58:26 np0005535656 podman[210916]: 2025-11-25 18:58:26.953645182 +0000 UTC m=+0.077765757 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 13:58:27 np0005535656 nova_compute[187219]: 2025-11-25 18:58:27.775 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:30 np0005535656 nova_compute[187219]: 2025-11-25 18:58:30.117 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:30 np0005535656 ovn_controller[95460]: 2025-11-25T18:58:30Z|00064|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 25 13:58:32 np0005535656 nova_compute[187219]: 2025-11-25 18:58:32.776 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:34 np0005535656 podman[210962]: 2025-11-25 18:58:34.937090711 +0000 UTC m=+0.059040674 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 13:58:35 np0005535656 nova_compute[187219]: 2025-11-25 18:58:35.118 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:35 np0005535656 podman[197580]: time="2025-11-25T18:58:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:58:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:58:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:58:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:58:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3056 "" "Go-http-client/1.1"
Nov 25 13:58:37 np0005535656 nova_compute[187219]: 2025-11-25 18:58:37.781 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:37 np0005535656 podman[210983]: 2025-11-25 18:58:37.938422813 +0000 UTC m=+0.063690778 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 13:58:40 np0005535656 nova_compute[187219]: 2025-11-25 18:58:40.121 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:42 np0005535656 nova_compute[187219]: 2025-11-25 18:58:42.785 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:45 np0005535656 nova_compute[187219]: 2025-11-25 18:58:45.124 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:47 np0005535656 nova_compute[187219]: 2025-11-25 18:58:47.788 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:48 np0005535656 nova_compute[187219]: 2025-11-25 18:58:48.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:58:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:58:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:58:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:58:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:58:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:58:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:58:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:58:49 np0005535656 podman[211004]: 2025-11-25 18:58:49.96771489 +0000 UTC m=+0.078742412 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 13:58:50 np0005535656 nova_compute[187219]: 2025-11-25 18:58:50.127 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:50 np0005535656 nova_compute[187219]: 2025-11-25 18:58:50.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:50 np0005535656 nova_compute[187219]: 2025-11-25 18:58:50.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:51 np0005535656 nova_compute[187219]: 2025-11-25 18:58:51.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:51 np0005535656 nova_compute[187219]: 2025-11-25 18:58:51.698 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:51 np0005535656 nova_compute[187219]: 2025-11-25 18:58:51.699 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:58:51 np0005535656 nova_compute[187219]: 2025-11-25 18:58:51.699 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:58:52 np0005535656 nova_compute[187219]: 2025-11-25 18:58:52.473 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:58:52 np0005535656 nova_compute[187219]: 2025-11-25 18:58:52.474 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:58:52 np0005535656 nova_compute[187219]: 2025-11-25 18:58:52.474 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 13:58:52 np0005535656 nova_compute[187219]: 2025-11-25 18:58:52.475 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:58:52 np0005535656 nova_compute[187219]: 2025-11-25 18:58:52.790 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:53 np0005535656 nova_compute[187219]: 2025-11-25 18:58:53.954 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Updating instance_info_cache with network_info: [{"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:58:54 np0005535656 nova_compute[187219]: 2025-11-25 18:58:54.014 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:58:54 np0005535656 nova_compute[187219]: 2025-11-25 18:58:54.015 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 13:58:54 np0005535656 nova_compute[187219]: 2025-11-25 18:58:54.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:54 np0005535656 nova_compute[187219]: 2025-11-25 18:58:54.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:58:55 np0005535656 nova_compute[187219]: 2025-11-25 18:58:55.128 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:55 np0005535656 nova_compute[187219]: 2025-11-25 18:58:55.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:55 np0005535656 nova_compute[187219]: 2025-11-25 18:58:55.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:56 np0005535656 nova_compute[187219]: 2025-11-25 18:58:56.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:58:56 np0005535656 nova_compute[187219]: 2025-11-25 18:58:56.709 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:58:56 np0005535656 nova_compute[187219]: 2025-11-25 18:58:56.709 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:58:56 np0005535656 nova_compute[187219]: 2025-11-25 18:58:56.710 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:58:56 np0005535656 nova_compute[187219]: 2025-11-25 18:58:56.710 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:58:56 np0005535656 nova_compute[187219]: 2025-11-25 18:58:56.788 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:58:56 np0005535656 nova_compute[187219]: 2025-11-25 18:58:56.878 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:58:56 np0005535656 nova_compute[187219]: 2025-11-25 18:58:56.880 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:58:56 np0005535656 nova_compute[187219]: 2025-11-25 18:58:56.967 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.226 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.229 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5737MB free_disk=73.13908004760742GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.229 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.230 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.359 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.359 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.360 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.451 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.474 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.476 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.477 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:58:57 np0005535656 nova_compute[187219]: 2025-11-25 18:58:57.795 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:58:57 np0005535656 podman[211035]: 2025-11-25 18:58:57.985807729 +0000 UTC m=+0.086135315 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 13:58:58 np0005535656 podman[211034]: 2025-11-25 18:58:58.016372288 +0000 UTC m=+0.122727685 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:58:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:58:59.073 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:58:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:58:59.074 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:58:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:58:59.075 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:00 np0005535656 nova_compute[187219]: 2025-11-25 18:59:00.131 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:01 np0005535656 nova_compute[187219]: 2025-11-25 18:59:01.477 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:02 np0005535656 nova_compute[187219]: 2025-11-25 18:59:02.800 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:05 np0005535656 nova_compute[187219]: 2025-11-25 18:59:05.133 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:05 np0005535656 podman[197580]: time="2025-11-25T18:59:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:59:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:59:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:59:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:59:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Nov 25 13:59:05 np0005535656 podman[211079]: 2025-11-25 18:59:05.951171175 +0000 UTC m=+0.071109944 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 13:59:07 np0005535656 nova_compute[187219]: 2025-11-25 18:59:07.803 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:08 np0005535656 podman[211114]: 2025-11-25 18:59:08.974086645 +0000 UTC m=+0.087260096 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 13:59:10 np0005535656 nova_compute[187219]: 2025-11-25 18:59:10.135 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:12 np0005535656 nova_compute[187219]: 2025-11-25 18:59:12.807 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:15 np0005535656 nova_compute[187219]: 2025-11-25 18:59:15.043 187223 DEBUG nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Creating tmpfile /var/lib/nova/instances/tmp4fa5jgu1 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 25 13:59:15 np0005535656 nova_compute[187219]: 2025-11-25 18:59:15.137 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:15 np0005535656 nova_compute[187219]: 2025-11-25 18:59:15.182 187223 DEBUG nova.compute.manager [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4fa5jgu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 25 13:59:17 np0005535656 nova_compute[187219]: 2025-11-25 18:59:17.173 187223 DEBUG nova.compute.manager [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4fa5jgu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='22cb9435-5d2b-429f-947c-0d6ce107ed06',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 25 13:59:17 np0005535656 nova_compute[187219]: 2025-11-25 18:59:17.206 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-22cb9435-5d2b-429f-947c-0d6ce107ed06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:59:17 np0005535656 nova_compute[187219]: 2025-11-25 18:59:17.206 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-22cb9435-5d2b-429f-947c-0d6ce107ed06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:59:17 np0005535656 nova_compute[187219]: 2025-11-25 18:59:17.206 187223 DEBUG nova.network.neutron [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 13:59:17 np0005535656 nova_compute[187219]: 2025-11-25 18:59:17.811 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:59:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:59:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:59:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:59:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:59:19 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:59:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.660 187223 DEBUG nova.network.neutron [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Updating instance_info_cache with network_info: [{"id": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "address": "fa:16:3e:9c:04:72", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf55e8d40-34", "ovs_interfaceid": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.695 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-22cb9435-5d2b-429f-947c-0d6ce107ed06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.697 187223 DEBUG nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4fa5jgu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='22cb9435-5d2b-429f-947c-0d6ce107ed06',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.698 187223 DEBUG nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Creating instance directory: /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.699 187223 DEBUG nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Creating disk.info with the contents: {'/var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk': 'qcow2', '/var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.699 187223 DEBUG nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.700 187223 DEBUG nova.objects.instance [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 22cb9435-5d2b-429f-947c-0d6ce107ed06 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.740 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.836 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.837 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.838 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.861 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.935 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:59:19 np0005535656 nova_compute[187219]: 2025-11-25 18:59:19.937 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.046 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk 1073741824" returned: 0 in 0.110s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.047 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.048 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.132 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.133 187223 DEBUG nova.virt.disk.api [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Checking if we can resize image /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.133 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.149 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.198 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.199 187223 DEBUG nova.virt.disk.api [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Cannot resize image /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.200 187223 DEBUG nova.objects.instance [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid 22cb9435-5d2b-429f-947c-0d6ce107ed06 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.274 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.299 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk.config 485376" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.301 187223 DEBUG nova.virt.libvirt.volume.remotefs [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk.config to /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.302 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk.config /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.826 187223 DEBUG oslo_concurrency.processutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06/disk.config /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.828 187223 DEBUG nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.830 187223 DEBUG nova.virt.libvirt.vif [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T18:57:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-911425224',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-911425224',id=8,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T18:58:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5c58d5fa6c9449d0ade4f6e196f5da2b',ramdisk_id='',reservation_id='r-s30iyvtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1744823125',owner_user_name='tempest-TestExecuteBasicStrategy-1744823125-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T18:58:07Z,user_data=None,user_id='752e3dfa795e4fd781c1bbb04a2f8e22',uuid=22cb9435-5d2b-429f-947c-0d6ce107ed06,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "address": "fa:16:3e:9c:04:72", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf55e8d40-34", "ovs_interfaceid": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.831 187223 DEBUG nova.network.os_vif_util [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "address": "fa:16:3e:9c:04:72", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf55e8d40-34", "ovs_interfaceid": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.832 187223 DEBUG nova.network.os_vif_util [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9c:04:72,bridge_name='br-int',has_traffic_filtering=True,id=f55e8d40-34a7-47d9-b278-5b1b90791f49,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf55e8d40-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.833 187223 DEBUG os_vif [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9c:04:72,bridge_name='br-int',has_traffic_filtering=True,id=f55e8d40-34a7-47d9-b278-5b1b90791f49,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf55e8d40-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.834 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.835 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.836 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.840 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.840 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf55e8d40-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.841 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf55e8d40-34, col_values=(('external_ids', {'iface-id': 'f55e8d40-34a7-47d9-b278-5b1b90791f49', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9c:04:72', 'vm-uuid': '22cb9435-5d2b-429f-947c-0d6ce107ed06'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:20 np0005535656 NetworkManager[55548]: <info>  [1764097160.8448] manager: (tapf55e8d40-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.845 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.855 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.855 187223 INFO os_vif [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9c:04:72,bridge_name='br-int',has_traffic_filtering=True,id=f55e8d40-34a7-47d9-b278-5b1b90791f49,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf55e8d40-34')#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.856 187223 DEBUG nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 25 13:59:20 np0005535656 nova_compute[187219]: 2025-11-25 18:59:20.856 187223 DEBUG nova.compute.manager [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4fa5jgu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='22cb9435-5d2b-429f-947c-0d6ce107ed06',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 25 13:59:20 np0005535656 podman[211160]: 2025-11-25 18:59:20.95926531 +0000 UTC m=+0.069354196 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:59:22 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:22.096 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:59:22 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:22.097 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 13:59:22 np0005535656 nova_compute[187219]: 2025-11-25 18:59:22.097 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:22 np0005535656 nova_compute[187219]: 2025-11-25 18:59:22.989 187223 DEBUG nova.network.neutron [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Port f55e8d40-34a7-47d9-b278-5b1b90791f49 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 25 13:59:22 np0005535656 nova_compute[187219]: 2025-11-25 18:59:22.991 187223 DEBUG nova.compute.manager [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4fa5jgu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='22cb9435-5d2b-429f-947c-0d6ce107ed06',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 25 13:59:23 np0005535656 systemd[1]: Starting libvirt proxy daemon...
Nov 25 13:59:23 np0005535656 systemd[1]: Started libvirt proxy daemon.
Nov 25 13:59:23 np0005535656 NetworkManager[55548]: <info>  [1764097163.3022] manager: (tapf55e8d40-34): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Nov 25 13:59:23 np0005535656 kernel: tapf55e8d40-34: entered promiscuous mode
Nov 25 13:59:23 np0005535656 nova_compute[187219]: 2025-11-25 18:59:23.305 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:23 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:23Z|00065|binding|INFO|Claiming lport f55e8d40-34a7-47d9-b278-5b1b90791f49 for this additional chassis.
Nov 25 13:59:23 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:23Z|00066|binding|INFO|f55e8d40-34a7-47d9-b278-5b1b90791f49: Claiming fa:16:3e:9c:04:72 10.100.0.14
Nov 25 13:59:23 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:23Z|00067|binding|INFO|Setting lport f55e8d40-34a7-47d9-b278-5b1b90791f49 ovn-installed in OVS
Nov 25 13:59:23 np0005535656 nova_compute[187219]: 2025-11-25 18:59:23.320 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:23 np0005535656 nova_compute[187219]: 2025-11-25 18:59:23.323 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:23 np0005535656 systemd-machined[153481]: New machine qemu-6-instance-00000008.
Nov 25 13:59:23 np0005535656 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Nov 25 13:59:23 np0005535656 systemd-udevd[211218]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 13:59:23 np0005535656 NetworkManager[55548]: <info>  [1764097163.3849] device (tapf55e8d40-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 13:59:23 np0005535656 NetworkManager[55548]: <info>  [1764097163.3870] device (tapf55e8d40-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 13:59:24 np0005535656 nova_compute[187219]: 2025-11-25 18:59:24.015 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097164.0139718, 22cb9435-5d2b-429f-947c-0d6ce107ed06 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:59:24 np0005535656 nova_compute[187219]: 2025-11-25 18:59:24.018 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] VM Started (Lifecycle Event)#033[00m
Nov 25 13:59:24 np0005535656 nova_compute[187219]: 2025-11-25 18:59:24.057 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:59:25 np0005535656 nova_compute[187219]: 2025-11-25 18:59:25.042 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097165.0426168, 22cb9435-5d2b-429f-947c-0d6ce107ed06 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:59:25 np0005535656 nova_compute[187219]: 2025-11-25 18:59:25.043 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] VM Resumed (Lifecycle Event)#033[00m
Nov 25 13:59:25 np0005535656 nova_compute[187219]: 2025-11-25 18:59:25.084 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:59:25 np0005535656 nova_compute[187219]: 2025-11-25 18:59:25.088 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 13:59:25 np0005535656 nova_compute[187219]: 2025-11-25 18:59:25.113 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 25 13:59:25 np0005535656 nova_compute[187219]: 2025-11-25 18:59:25.141 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:25 np0005535656 nova_compute[187219]: 2025-11-25 18:59:25.844 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:26 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:26Z|00068|binding|INFO|Claiming lport f55e8d40-34a7-47d9-b278-5b1b90791f49 for this chassis.
Nov 25 13:59:26 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:26Z|00069|binding|INFO|f55e8d40-34a7-47d9-b278-5b1b90791f49: Claiming fa:16:3e:9c:04:72 10.100.0.14
Nov 25 13:59:26 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:26Z|00070|binding|INFO|Setting lport f55e8d40-34a7-47d9-b278-5b1b90791f49 up in Southbound
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.291 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:04:72 10.100.0.14'], port_security=['fa:16:3e:9c:04:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '22cb9435-5d2b-429f-947c-0d6ce107ed06', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c58d5fa6c9449d0ade4f6e196f5da2b', 'neutron:revision_number': '11', 'neutron:security_group_ids': '7a99ea8d-0bac-44ff-8604-506502702def', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61f992e8-7ad3-4f4f-9d1f-85cebb75b26e, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=f55e8d40-34a7-47d9-b278-5b1b90791f49) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.293 104346 INFO neutron.agent.ovn.metadata.agent [-] Port f55e8d40-34a7-47d9-b278-5b1b90791f49 in datapath ec24c862-d31e-4059-8ccd-fa96e33dc558 bound to our chassis#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.295 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec24c862-d31e-4059-8ccd-fa96e33dc558#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.311 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c41773-61c5-4e10-8a15-62c85e515e53]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.348 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d34d97-3b4f-43c7-bc5c-1201e6a8e5e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.351 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[3848ca3f-aed7-4c50-b0ed-2347536d0e7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.392 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[e25632d0-f281-4a76-b0d7-57c046864371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.418 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[fb823b7f-ddf2-4d44-a770-afbcf452ffd8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec24c862-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:3a:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403120, 'reachable_time': 40215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211253, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.443 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a36703b8-598a-4eac-8256-e59382b4a030]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec24c862-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403134, 'tstamp': 403134}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211254, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec24c862-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403136, 'tstamp': 403136}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211254, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.445 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec24c862-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:26 np0005535656 nova_compute[187219]: 2025-11-25 18:59:26.447 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.450 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec24c862-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.451 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.451 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec24c862-d0, col_values=(('external_ids', {'iface-id': '55ac4047-68e6-4347-bbd5-a2ba83dd6713'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:26.452 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:59:26 np0005535656 nova_compute[187219]: 2025-11-25 18:59:26.567 187223 INFO nova.compute.manager [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Post operation of migration started#033[00m
Nov 25 13:59:26 np0005535656 nova_compute[187219]: 2025-11-25 18:59:26.865 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-22cb9435-5d2b-429f-947c-0d6ce107ed06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 13:59:26 np0005535656 nova_compute[187219]: 2025-11-25 18:59:26.866 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-22cb9435-5d2b-429f-947c-0d6ce107ed06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 13:59:26 np0005535656 nova_compute[187219]: 2025-11-25 18:59:26.867 187223 DEBUG nova.network.neutron [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 13:59:28 np0005535656 nova_compute[187219]: 2025-11-25 18:59:28.240 187223 DEBUG nova.network.neutron [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Updating instance_info_cache with network_info: [{"id": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "address": "fa:16:3e:9c:04:72", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf55e8d40-34", "ovs_interfaceid": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:59:28 np0005535656 nova_compute[187219]: 2025-11-25 18:59:28.321 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-22cb9435-5d2b-429f-947c-0d6ce107ed06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 13:59:28 np0005535656 nova_compute[187219]: 2025-11-25 18:59:28.404 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:28 np0005535656 nova_compute[187219]: 2025-11-25 18:59:28.405 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:28 np0005535656 nova_compute[187219]: 2025-11-25 18:59:28.405 187223 DEBUG oslo_concurrency.lockutils [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:28 np0005535656 nova_compute[187219]: 2025-11-25 18:59:28.411 187223 INFO nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 25 13:59:28 np0005535656 virtqemud[186765]: Domain id=6 name='instance-00000008' uuid=22cb9435-5d2b-429f-947c-0d6ce107ed06 is tainted: custom-monitor
Nov 25 13:59:28 np0005535656 podman[211256]: 2025-11-25 18:59:28.995032491 +0000 UTC m=+0.091025057 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 13:59:29 np0005535656 podman[211255]: 2025-11-25 18:59:29.043323112 +0000 UTC m=+0.145379301 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251118)
Nov 25 13:59:29 np0005535656 nova_compute[187219]: 2025-11-25 18:59:29.420 187223 INFO nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 25 13:59:30 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:30.100 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:30 np0005535656 nova_compute[187219]: 2025-11-25 18:59:30.144 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:30 np0005535656 nova_compute[187219]: 2025-11-25 18:59:30.429 187223 INFO nova.virt.libvirt.driver [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 25 13:59:30 np0005535656 nova_compute[187219]: 2025-11-25 18:59:30.437 187223 DEBUG nova.compute.manager [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:59:30 np0005535656 nova_compute[187219]: 2025-11-25 18:59:30.674 187223 DEBUG nova.objects.instance [None req-3a6a4d1b-96af-4b06-94c4-c76b1dafb77d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 25 13:59:30 np0005535656 nova_compute[187219]: 2025-11-25 18:59:30.846 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.185 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.302 187223 DEBUG oslo_concurrency.lockutils [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "22cb9435-5d2b-429f-947c-0d6ce107ed06" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.303 187223 DEBUG oslo_concurrency.lockutils [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "22cb9435-5d2b-429f-947c-0d6ce107ed06" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.303 187223 DEBUG oslo_concurrency.lockutils [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "22cb9435-5d2b-429f-947c-0d6ce107ed06-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.303 187223 DEBUG oslo_concurrency.lockutils [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "22cb9435-5d2b-429f-947c-0d6ce107ed06-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.304 187223 DEBUG oslo_concurrency.lockutils [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "22cb9435-5d2b-429f-947c-0d6ce107ed06-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.305 187223 INFO nova.compute.manager [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Terminating instance#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.306 187223 DEBUG nova.compute.manager [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 25 13:59:35 np0005535656 kernel: tapf55e8d40-34 (unregistering): left promiscuous mode
Nov 25 13:59:35 np0005535656 NetworkManager[55548]: <info>  [1764097175.3328] device (tapf55e8d40-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 13:59:35 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:35Z|00071|binding|INFO|Releasing lport f55e8d40-34a7-47d9-b278-5b1b90791f49 from this chassis (sb_readonly=0)
Nov 25 13:59:35 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:35Z|00072|binding|INFO|Setting lport f55e8d40-34a7-47d9-b278-5b1b90791f49 down in Southbound
Nov 25 13:59:35 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:35Z|00073|binding|INFO|Removing iface tapf55e8d40-34 ovn-installed in OVS
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.346 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.352 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:04:72 10.100.0.14'], port_security=['fa:16:3e:9c:04:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '22cb9435-5d2b-429f-947c-0d6ce107ed06', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c58d5fa6c9449d0ade4f6e196f5da2b', 'neutron:revision_number': '13', 'neutron:security_group_ids': '7a99ea8d-0bac-44ff-8604-506502702def', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61f992e8-7ad3-4f4f-9d1f-85cebb75b26e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=f55e8d40-34a7-47d9-b278-5b1b90791f49) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.354 104346 INFO neutron.agent.ovn.metadata.agent [-] Port f55e8d40-34a7-47d9-b278-5b1b90791f49 in datapath ec24c862-d31e-4059-8ccd-fa96e33dc558 unbound from our chassis#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.355 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec24c862-d31e-4059-8ccd-fa96e33dc558#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.367 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.380 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[943392b3-054d-4c67-b47b-a4fb9a839bc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:35 np0005535656 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 25 13:59:35 np0005535656 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 1.826s CPU time.
Nov 25 13:59:35 np0005535656 systemd-machined[153481]: Machine qemu-6-instance-00000008 terminated.
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.427 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[08bfc9a1-05cb-4aa9-8efb-0117555567f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.430 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad88657-491a-4e90-9d4a-5a16fa2aa696]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.453 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[b5805bee-365b-4554-9c0b-7a300e9b5bf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.467 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd65e25-7ab7-4b72-9553-1eea0c0ec4cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec24c862-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:3a:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403120, 'reachable_time': 40215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211315, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.479 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a21b67bd-f898-44b7-8d78-aacd275dbf21]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec24c862-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403134, 'tstamp': 403134}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211316, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec24c862-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403136, 'tstamp': 403136}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211316, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.480 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec24c862-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.482 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.486 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.486 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec24c862-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.487 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.487 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec24c862-d0, col_values=(('external_ids', {'iface-id': '55ac4047-68e6-4347-bbd5-a2ba83dd6713'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:35.487 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.523 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.527 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.573 187223 INFO nova.virt.libvirt.driver [-] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Instance destroyed successfully.#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.573 187223 DEBUG nova.objects.instance [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lazy-loading 'resources' on Instance uuid 22cb9435-5d2b-429f-947c-0d6ce107ed06 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.592 187223 DEBUG nova.virt.libvirt.vif [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-25T18:57:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-911425224',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-911425224',id=8,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T18:58:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c58d5fa6c9449d0ade4f6e196f5da2b',ramdisk_id='',reservation_id='r-s30iyvtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_r
am='0',owner_project_name='tempest-TestExecuteBasicStrategy-1744823125',owner_user_name='tempest-TestExecuteBasicStrategy-1744823125-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T18:59:30Z,user_data=None,user_id='752e3dfa795e4fd781c1bbb04a2f8e22',uuid=22cb9435-5d2b-429f-947c-0d6ce107ed06,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "address": "fa:16:3e:9c:04:72", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf55e8d40-34", "ovs_interfaceid": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.592 187223 DEBUG nova.network.os_vif_util [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Converting VIF {"id": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "address": "fa:16:3e:9c:04:72", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf55e8d40-34", "ovs_interfaceid": "f55e8d40-34a7-47d9-b278-5b1b90791f49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.592 187223 DEBUG nova.network.os_vif_util [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9c:04:72,bridge_name='br-int',has_traffic_filtering=True,id=f55e8d40-34a7-47d9-b278-5b1b90791f49,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf55e8d40-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.593 187223 DEBUG os_vif [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9c:04:72,bridge_name='br-int',has_traffic_filtering=True,id=f55e8d40-34a7-47d9-b278-5b1b90791f49,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf55e8d40-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.594 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.594 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf55e8d40-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.596 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.597 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.599 187223 INFO os_vif [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9c:04:72,bridge_name='br-int',has_traffic_filtering=True,id=f55e8d40-34a7-47d9-b278-5b1b90791f49,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf55e8d40-34')#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.600 187223 INFO nova.virt.libvirt.driver [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Deleting instance files /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06_del#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.600 187223 INFO nova.virt.libvirt.driver [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Deletion of /var/lib/nova/instances/22cb9435-5d2b-429f-947c-0d6ce107ed06_del complete#033[00m
Nov 25 13:59:35 np0005535656 podman[197580]: time="2025-11-25T18:59:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 13:59:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:59:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.656 187223 INFO nova.compute.manager [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.657 187223 DEBUG oslo.service.loopingcall [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.657 187223 DEBUG nova.compute.manager [-] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.657 187223 DEBUG nova.network.neutron [-] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 25 13:59:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:18:59:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3055 "" "Go-http-client/1.1"
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.836 187223 DEBUG nova.compute.manager [req-d6b06392-24f5-48a2-9ff1-03571a37766a req-c44102d2-454b-4eaa-8d0a-d468ddfae9cf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Received event network-vif-unplugged-f55e8d40-34a7-47d9-b278-5b1b90791f49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.836 187223 DEBUG oslo_concurrency.lockutils [req-d6b06392-24f5-48a2-9ff1-03571a37766a req-c44102d2-454b-4eaa-8d0a-d468ddfae9cf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "22cb9435-5d2b-429f-947c-0d6ce107ed06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.837 187223 DEBUG oslo_concurrency.lockutils [req-d6b06392-24f5-48a2-9ff1-03571a37766a req-c44102d2-454b-4eaa-8d0a-d468ddfae9cf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "22cb9435-5d2b-429f-947c-0d6ce107ed06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.837 187223 DEBUG oslo_concurrency.lockutils [req-d6b06392-24f5-48a2-9ff1-03571a37766a req-c44102d2-454b-4eaa-8d0a-d468ddfae9cf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "22cb9435-5d2b-429f-947c-0d6ce107ed06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.838 187223 DEBUG nova.compute.manager [req-d6b06392-24f5-48a2-9ff1-03571a37766a req-c44102d2-454b-4eaa-8d0a-d468ddfae9cf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] No waiting events found dispatching network-vif-unplugged-f55e8d40-34a7-47d9-b278-5b1b90791f49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:59:35 np0005535656 nova_compute[187219]: 2025-11-25 18:59:35.838 187223 DEBUG nova.compute.manager [req-d6b06392-24f5-48a2-9ff1-03571a37766a req-c44102d2-454b-4eaa-8d0a-d468ddfae9cf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Received event network-vif-unplugged-f55e8d40-34a7-47d9-b278-5b1b90791f49 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 13:59:36 np0005535656 nova_compute[187219]: 2025-11-25 18:59:36.185 187223 DEBUG nova.network.neutron [-] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:59:36 np0005535656 nova_compute[187219]: 2025-11-25 18:59:36.209 187223 INFO nova.compute.manager [-] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Took 0.55 seconds to deallocate network for instance.#033[00m
Nov 25 13:59:36 np0005535656 nova_compute[187219]: 2025-11-25 18:59:36.302 187223 DEBUG oslo_concurrency.lockutils [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:36 np0005535656 nova_compute[187219]: 2025-11-25 18:59:36.302 187223 DEBUG oslo_concurrency.lockutils [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:36 np0005535656 nova_compute[187219]: 2025-11-25 18:59:36.308 187223 DEBUG nova.compute.manager [req-e9af1ad0-05dd-4d0f-ac77-4a7c16eab936 req-d013b3f5-cee3-4764-88f2-3a999b8efa9a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Received event network-vif-deleted-f55e8d40-34a7-47d9-b278-5b1b90791f49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:59:36 np0005535656 nova_compute[187219]: 2025-11-25 18:59:36.309 187223 DEBUG oslo_concurrency.lockutils [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:36 np0005535656 nova_compute[187219]: 2025-11-25 18:59:36.333 187223 INFO nova.scheduler.client.report [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Deleted allocations for instance 22cb9435-5d2b-429f-947c-0d6ce107ed06#033[00m
Nov 25 13:59:36 np0005535656 nova_compute[187219]: 2025-11-25 18:59:36.410 187223 DEBUG oslo_concurrency.lockutils [None req-9e82a3b7-c861-4188-948c-0ed58cde428e 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "22cb9435-5d2b-429f-947c-0d6ce107ed06" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:36 np0005535656 podman[211334]: 2025-11-25 18:59:36.981040448 +0000 UTC m=+0.086651590 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.227 187223 DEBUG oslo_concurrency.lockutils [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.228 187223 DEBUG oslo_concurrency.lockutils [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.229 187223 DEBUG oslo_concurrency.lockutils [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.229 187223 DEBUG oslo_concurrency.lockutils [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.229 187223 DEBUG oslo_concurrency.lockutils [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.231 187223 INFO nova.compute.manager [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Terminating instance#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.232 187223 DEBUG nova.compute.manager [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 25 13:59:37 np0005535656 kernel: tap3021cfeb-4e (unregistering): left promiscuous mode
Nov 25 13:59:37 np0005535656 NetworkManager[55548]: <info>  [1764097177.2622] device (tap3021cfeb-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 13:59:37 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:37Z|00074|binding|INFO|Releasing lport 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa from this chassis (sb_readonly=0)
Nov 25 13:59:37 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:37Z|00075|binding|INFO|Setting lport 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa down in Southbound
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.264 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:37 np0005535656 ovn_controller[95460]: 2025-11-25T18:59:37Z|00076|binding|INFO|Removing iface tap3021cfeb-4e ovn-installed in OVS
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.268 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.274 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:5b:c3 10.100.0.3'], port_security=['fa:16:3e:1c:5b:c3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c58d5fa6c9449d0ade4f6e196f5da2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a99ea8d-0bac-44ff-8604-506502702def', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61f992e8-7ad3-4f4f-9d1f-85cebb75b26e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.276 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa in datapath ec24c862-d31e-4059-8ccd-fa96e33dc558 unbound from our chassis#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.279 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec24c862-d31e-4059-8ccd-fa96e33dc558, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.280 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e342dae8-73bb-46ea-9cc0-be7d146f79c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.281 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558 namespace which is not needed anymore#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.286 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:37 np0005535656 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 25 13:59:37 np0005535656 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 19.247s CPU time.
Nov 25 13:59:37 np0005535656 systemd-machined[153481]: Machine qemu-5-instance-00000007 terminated.
Nov 25 13:59:37 np0005535656 NetworkManager[55548]: <info>  [1764097177.4634] manager: (tap3021cfeb-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.511 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:37 np0005535656 neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558[210760]: [NOTICE]   (210764) : haproxy version is 2.8.14-c23fe91
Nov 25 13:59:37 np0005535656 neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558[210760]: [NOTICE]   (210764) : path to executable is /usr/sbin/haproxy
Nov 25 13:59:37 np0005535656 neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558[210760]: [WARNING]  (210764) : Exiting Master process...
Nov 25 13:59:37 np0005535656 neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558[210760]: [WARNING]  (210764) : Exiting Master process...
Nov 25 13:59:37 np0005535656 neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558[210760]: [ALERT]    (210764) : Current worker (210766) exited with code 143 (Terminated)
Nov 25 13:59:37 np0005535656 neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558[210760]: [WARNING]  (210764) : All workers exited. Exiting... (0)
Nov 25 13:59:37 np0005535656 systemd[1]: libpod-6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c.scope: Deactivated successfully.
Nov 25 13:59:37 np0005535656 podman[211377]: 2025-11-25 18:59:37.524244566 +0000 UTC m=+0.087469812 container died 6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.546 187223 INFO nova.virt.libvirt.driver [-] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Instance destroyed successfully.#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.547 187223 DEBUG nova.objects.instance [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lazy-loading 'resources' on Instance uuid bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 13:59:37 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c-userdata-shm.mount: Deactivated successfully.
Nov 25 13:59:37 np0005535656 systemd[1]: var-lib-containers-storage-overlay-ae13029a561b7faf15c5f2a9ef0964a6ef02f98c7fa69c812cf5f3c9e7809d3a-merged.mount: Deactivated successfully.
Nov 25 13:59:37 np0005535656 podman[211377]: 2025-11-25 18:59:37.568233293 +0000 UTC m=+0.131458519 container cleanup 6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.572 187223 DEBUG nova.virt.libvirt.vif [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T18:57:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1978486513',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1978486513',id=7,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T18:57:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c58d5fa6c9449d0ade4f6e196f5da2b',ramdisk_id='',reservation_id='r-0b3zgyo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1744823125',owner_user_name='tempest-TestExecuteBasicStrategy-1744823125-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T18:57:51Z,user_data=None,user_id='752e3dfa795e4fd781c1bbb04a2f8e22',uuid=bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.573 187223 DEBUG nova.network.os_vif_util [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Converting VIF {"id": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "address": "fa:16:3e:1c:5b:c3", "network": {"id": "ec24c862-d31e-4059-8ccd-fa96e33dc558", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-540533727-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c58d5fa6c9449d0ade4f6e196f5da2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3021cfeb-4e", "ovs_interfaceid": "3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.575 187223 DEBUG nova.network.os_vif_util [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:5b:c3,bridge_name='br-int',has_traffic_filtering=True,id=3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3021cfeb-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.575 187223 DEBUG os_vif [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:5b:c3,bridge_name='br-int',has_traffic_filtering=True,id=3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3021cfeb-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.579 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.580 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3021cfeb-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.583 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.585 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:37 np0005535656 systemd[1]: libpod-conmon-6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c.scope: Deactivated successfully.
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.589 187223 INFO os_vif [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:5b:c3,bridge_name='br-int',has_traffic_filtering=True,id=3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa,network=Network(ec24c862-d31e-4059-8ccd-fa96e33dc558),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3021cfeb-4e')#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.590 187223 INFO nova.virt.libvirt.driver [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Deleting instance files /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6_del#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.590 187223 INFO nova.virt.libvirt.driver [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Deletion of /var/lib/nova/instances/bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6_del complete#033[00m
Nov 25 13:59:37 np0005535656 podman[211422]: 2025-11-25 18:59:37.646994101 +0000 UTC m=+0.050789540 container remove 6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.650 187223 INFO nova.compute.manager [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.651 187223 DEBUG oslo.service.loopingcall [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.651 187223 DEBUG nova.compute.manager [-] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.651 187223 DEBUG nova.network.neutron [-] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.652 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[227f0546-7b20-4b8a-a742-c1163d7b7ded]: (4, ('Tue Nov 25 06:59:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558 (6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c)\n6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c\nTue Nov 25 06:59:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558 (6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c)\n6a432782c6c1eedc0e94f84fc109cb8cf768b8db0c2558a94f0fee478142ea1c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.654 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c915b297-3b52-425c-8e6c-b7a6babf48d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.656 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec24c862-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.658 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:37 np0005535656 kernel: tapec24c862-d0: left promiscuous mode
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.671 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.674 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e8db95-96e5-4662-8d43-56fe4f687b47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.687 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c0189db8-b5c2-472f-b811-060df310ddf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.688 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[23547c04-911f-4e52-b459-614385a718e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.707 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e8875ccd-ff8c-47c0-b6f2-3e75758f1d3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403114, 'reachable_time': 42339, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211437, 'error': None, 'target': 'ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.710 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ec24c862-d31e-4059-8ccd-fa96e33dc558 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 13:59:37 np0005535656 systemd[1]: run-netns-ovnmeta\x2dec24c862\x2dd31e\x2d4059\x2d8ccd\x2dfa96e33dc558.mount: Deactivated successfully.
Nov 25 13:59:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:37.711 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[0616b692-54f0-48ff-9bd5-cdd9a3e72a4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.929 187223 DEBUG nova.compute.manager [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Received event network-vif-plugged-f55e8d40-34a7-47d9-b278-5b1b90791f49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.930 187223 DEBUG oslo_concurrency.lockutils [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "22cb9435-5d2b-429f-947c-0d6ce107ed06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.930 187223 DEBUG oslo_concurrency.lockutils [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "22cb9435-5d2b-429f-947c-0d6ce107ed06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.931 187223 DEBUG oslo_concurrency.lockutils [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "22cb9435-5d2b-429f-947c-0d6ce107ed06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.931 187223 DEBUG nova.compute.manager [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] No waiting events found dispatching network-vif-plugged-f55e8d40-34a7-47d9-b278-5b1b90791f49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.932 187223 WARNING nova.compute.manager [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Received unexpected event network-vif-plugged-f55e8d40-34a7-47d9-b278-5b1b90791f49 for instance with vm_state deleted and task_state None.#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.932 187223 DEBUG nova.compute.manager [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Received event network-vif-unplugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.933 187223 DEBUG oslo_concurrency.lockutils [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.933 187223 DEBUG oslo_concurrency.lockutils [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.934 187223 DEBUG oslo_concurrency.lockutils [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.934 187223 DEBUG nova.compute.manager [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] No waiting events found dispatching network-vif-unplugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:59:37 np0005535656 nova_compute[187219]: 2025-11-25 18:59:37.934 187223 DEBUG nova.compute.manager [req-a89e339d-de74-4020-bd8b-de8d37b3a37b req-8fce97d0-67d1-4812-80e6-52823ca361f7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Received event network-vif-unplugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.172 187223 DEBUG nova.network.neutron [-] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.191 187223 INFO nova.compute.manager [-] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Took 0.54 seconds to deallocate network for instance.#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.236 187223 DEBUG oslo_concurrency.lockutils [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.237 187223 DEBUG oslo_concurrency.lockutils [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.299 187223 DEBUG nova.compute.provider_tree [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.323 187223 DEBUG nova.scheduler.client.report [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.351 187223 DEBUG oslo_concurrency.lockutils [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.386 187223 DEBUG nova.compute.manager [req-14e61c2c-5bd0-4369-a2ac-14e085f5487f req-3703b5eb-c317-4175-91b2-4ca8c25ee169 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Received event network-vif-deleted-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.399 187223 INFO nova.scheduler.client.report [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Deleted allocations for instance bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6#033[00m
Nov 25 13:59:38 np0005535656 nova_compute[187219]: 2025-11-25 18:59:38.509 187223 DEBUG oslo_concurrency.lockutils [None req-e472e53d-7789-4c93-a6bf-2f6934ce4e09 752e3dfa795e4fd781c1bbb04a2f8e22 5c58d5fa6c9449d0ade4f6e196f5da2b - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:39 np0005535656 podman[211438]: 2025-11-25 18:59:39.975923079 +0000 UTC m=+0.090364969 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 13:59:40 np0005535656 nova_compute[187219]: 2025-11-25 18:59:40.009 187223 DEBUG nova.compute.manager [req-acd7ac26-bbbf-4aff-9ac8-c11be40f618b req-9c732757-bb33-4488-a8eb-d3e096ad67be 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Received event network-vif-plugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 13:59:40 np0005535656 nova_compute[187219]: 2025-11-25 18:59:40.009 187223 DEBUG oslo_concurrency.lockutils [req-acd7ac26-bbbf-4aff-9ac8-c11be40f618b req-9c732757-bb33-4488-a8eb-d3e096ad67be 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:40 np0005535656 nova_compute[187219]: 2025-11-25 18:59:40.010 187223 DEBUG oslo_concurrency.lockutils [req-acd7ac26-bbbf-4aff-9ac8-c11be40f618b req-9c732757-bb33-4488-a8eb-d3e096ad67be 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:40 np0005535656 nova_compute[187219]: 2025-11-25 18:59:40.010 187223 DEBUG oslo_concurrency.lockutils [req-acd7ac26-bbbf-4aff-9ac8-c11be40f618b req-9c732757-bb33-4488-a8eb-d3e096ad67be 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:40 np0005535656 nova_compute[187219]: 2025-11-25 18:59:40.011 187223 DEBUG nova.compute.manager [req-acd7ac26-bbbf-4aff-9ac8-c11be40f618b req-9c732757-bb33-4488-a8eb-d3e096ad67be 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] No waiting events found dispatching network-vif-plugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 13:59:40 np0005535656 nova_compute[187219]: 2025-11-25 18:59:40.011 187223 WARNING nova.compute.manager [req-acd7ac26-bbbf-4aff-9ac8-c11be40f618b req-9c732757-bb33-4488-a8eb-d3e096ad67be 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Received unexpected event network-vif-plugged-3021cfeb-4e78-4348-bec2-8dd6ec2d5cfa for instance with vm_state deleted and task_state None.#033[00m
Nov 25 13:59:40 np0005535656 nova_compute[187219]: 2025-11-25 18:59:40.187 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:42 np0005535656 nova_compute[187219]: 2025-11-25 18:59:42.584 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:45 np0005535656 nova_compute[187219]: 2025-11-25 18:59:45.188 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:47 np0005535656 nova_compute[187219]: 2025-11-25 18:59:47.587 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:59:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 13:59:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 13:59:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 13:59:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:59:49 np0005535656 openstack_network_exporter[199738]: ERROR   18:59:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 13:59:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 13:59:49 np0005535656 nova_compute[187219]: 2025-11-25 18:59:49.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:49 np0005535656 nova_compute[187219]: 2025-11-25 18:59:49.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:50 np0005535656 nova_compute[187219]: 2025-11-25 18:59:50.206 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:50 np0005535656 nova_compute[187219]: 2025-11-25 18:59:50.572 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764097175.5715156, 22cb9435-5d2b-429f-947c-0d6ce107ed06 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:59:50 np0005535656 nova_compute[187219]: 2025-11-25 18:59:50.573 187223 INFO nova.compute.manager [-] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] VM Stopped (Lifecycle Event)#033[00m
Nov 25 13:59:50 np0005535656 nova_compute[187219]: 2025-11-25 18:59:50.592 187223 DEBUG nova.compute.manager [None req-85476a4f-0074-4ee9-9af3-944b1e235c8b - - - - - -] [instance: 22cb9435-5d2b-429f-947c-0d6ce107ed06] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:59:50 np0005535656 nova_compute[187219]: 2025-11-25 18:59:50.680 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:51 np0005535656 nova_compute[187219]: 2025-11-25 18:59:51.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:51 np0005535656 nova_compute[187219]: 2025-11-25 18:59:51.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 25 13:59:51 np0005535656 podman[211459]: 2025-11-25 18:59:51.93697766 +0000 UTC m=+0.059604687 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.544 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764097177.54406, bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.545 187223 INFO nova.compute.manager [-] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] VM Stopped (Lifecycle Event)#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.574 187223 DEBUG nova.compute.manager [None req-6042f694-656d-4897-a10b-7929ae054c4f - - - - - -] [instance: bdb24ea0-e9ca-4335-a48c-c073f8b6a2c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.590 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.689 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.690 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.690 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.710 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.711 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.711 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.712 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 25 13:59:52 np0005535656 nova_compute[187219]: 2025-11-25 18:59:52.730 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 25 13:59:55 np0005535656 nova_compute[187219]: 2025-11-25 18:59:55.245 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:55 np0005535656 nova_compute[187219]: 2025-11-25 18:59:55.691 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:56 np0005535656 nova_compute[187219]: 2025-11-25 18:59:56.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:56 np0005535656 nova_compute[187219]: 2025-11-25 18:59:56.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:56 np0005535656 nova_compute[187219]: 2025-11-25 18:59:56.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 13:59:57 np0005535656 nova_compute[187219]: 2025-11-25 18:59:57.594 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 13:59:58 np0005535656 nova_compute[187219]: 2025-11-25 18:59:58.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 13:59:58 np0005535656 nova_compute[187219]: 2025-11-25 18:59:58.703 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:58 np0005535656 nova_compute[187219]: 2025-11-25 18:59:58.704 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:58 np0005535656 nova_compute[187219]: 2025-11-25 18:59:58.704 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:58 np0005535656 nova_compute[187219]: 2025-11-25 18:59:58.705 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 13:59:58 np0005535656 nova_compute[187219]: 2025-11-25 18:59:58.979 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 13:59:58 np0005535656 nova_compute[187219]: 2025-11-25 18:59:58.981 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5881MB free_disk=73.1639404296875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 13:59:58 np0005535656 nova_compute[187219]: 2025-11-25 18:59:58.982 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:58 np0005535656 nova_compute[187219]: 2025-11-25 18:59:58.982 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:59.075 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 13:59:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:59.076 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 13:59:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 18:59:59.076 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:59 np0005535656 nova_compute[187219]: 2025-11-25 18:59:59.153 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 13:59:59 np0005535656 nova_compute[187219]: 2025-11-25 18:59:59.153 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 13:59:59 np0005535656 nova_compute[187219]: 2025-11-25 18:59:59.196 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 13:59:59 np0005535656 nova_compute[187219]: 2025-11-25 18:59:59.211 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 13:59:59 np0005535656 nova_compute[187219]: 2025-11-25 18:59:59.234 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 13:59:59 np0005535656 nova_compute[187219]: 2025-11-25 18:59:59.235 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 13:59:59 np0005535656 podman[211485]: 2025-11-25 18:59:59.973882249 +0000 UTC m=+0.080564246 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:00:00 np0005535656 podman[211484]: 2025-11-25 19:00:00.028150432 +0000 UTC m=+0.140758069 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:00:00 np0005535656 nova_compute[187219]: 2025-11-25 19:00:00.248 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:02 np0005535656 nova_compute[187219]: 2025-11-25 19:00:02.235 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:02 np0005535656 nova_compute[187219]: 2025-11-25 19:00:02.597 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:05 np0005535656 nova_compute[187219]: 2025-11-25 19:00:05.249 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:05 np0005535656 podman[197580]: time="2025-11-25T19:00:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:00:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:00:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:00:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:00:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Nov 25 14:00:07 np0005535656 nova_compute[187219]: 2025-11-25 19:00:07.604 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:07 np0005535656 podman[211530]: 2025-11-25 19:00:07.948679417 +0000 UTC m=+0.066843080 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, distribution-scope=public)
Nov 25 14:00:08 np0005535656 ovn_controller[95460]: 2025-11-25T19:00:08Z|00077|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Nov 25 14:00:10 np0005535656 nova_compute[187219]: 2025-11-25 19:00:10.251 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:10 np0005535656 podman[211552]: 2025-11-25 19:00:10.981532954 +0000 UTC m=+0.094557212 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 14:00:12 np0005535656 nova_compute[187219]: 2025-11-25 19:00:12.608 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:15 np0005535656 nova_compute[187219]: 2025-11-25 19:00:15.254 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:17 np0005535656 nova_compute[187219]: 2025-11-25 19:00:17.611 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:19 np0005535656 nova_compute[187219]: 2025-11-25 19:00:19.044 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:00:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:00:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:00:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:00:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:00:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:00:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:00:20 np0005535656 nova_compute[187219]: 2025-11-25 19:00:20.255 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:22 np0005535656 nova_compute[187219]: 2025-11-25 19:00:22.615 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:22 np0005535656 podman[211574]: 2025-11-25 19:00:22.935177316 +0000 UTC m=+0.060808187 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:00:25 np0005535656 nova_compute[187219]: 2025-11-25 19:00:25.310 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:00:26.586 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:00:26 np0005535656 nova_compute[187219]: 2025-11-25 19:00:26.586 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:00:26.588 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:00:27 np0005535656 nova_compute[187219]: 2025-11-25 19:00:27.657 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:30 np0005535656 nova_compute[187219]: 2025-11-25 19:00:30.312 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:30 np0005535656 podman[211599]: 2025-11-25 19:00:30.945534618 +0000 UTC m=+0.068569006 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:00:31 np0005535656 podman[211598]: 2025-11-25 19:00:31.003766257 +0000 UTC m=+0.122797488 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 25 14:00:32 np0005535656 nova_compute[187219]: 2025-11-25 19:00:32.704 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:35 np0005535656 nova_compute[187219]: 2025-11-25 19:00:35.356 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:00:35.591 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:00:35 np0005535656 podman[197580]: time="2025-11-25T19:00:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:00:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:00:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:00:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:00:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Nov 25 14:00:37 np0005535656 nova_compute[187219]: 2025-11-25 19:00:37.707 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:38 np0005535656 podman[211644]: 2025-11-25 19:00:38.960709596 +0000 UTC m=+0.078310636 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 14:00:40 np0005535656 nova_compute[187219]: 2025-11-25 19:00:40.358 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:42 np0005535656 podman[211667]: 2025-11-25 19:00:42.018851111 +0000 UTC m=+0.123608839 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:00:42 np0005535656 nova_compute[187219]: 2025-11-25 19:00:42.711 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:45 np0005535656 nova_compute[187219]: 2025-11-25 19:00:45.360 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:47 np0005535656 nova_compute[187219]: 2025-11-25 19:00:47.714 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:00:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:00:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:00:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:00:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:00:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:00:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:00:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:00:49 np0005535656 nova_compute[187219]: 2025-11-25 19:00:49.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:50 np0005535656 nova_compute[187219]: 2025-11-25 19:00:50.362 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:51 np0005535656 nova_compute[187219]: 2025-11-25 19:00:51.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:52 np0005535656 nova_compute[187219]: 2025-11-25 19:00:52.761 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:53 np0005535656 nova_compute[187219]: 2025-11-25 19:00:53.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:53 np0005535656 nova_compute[187219]: 2025-11-25 19:00:53.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:00:53 np0005535656 nova_compute[187219]: 2025-11-25 19:00:53.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:00:53 np0005535656 nova_compute[187219]: 2025-11-25 19:00:53.694 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:00:53 np0005535656 podman[211687]: 2025-11-25 19:00:53.992473839 +0000 UTC m=+0.110002166 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:00:54 np0005535656 ovn_controller[95460]: 2025-11-25T19:00:54Z|00078|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 14:00:54 np0005535656 nova_compute[187219]: 2025-11-25 19:00:54.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:55 np0005535656 nova_compute[187219]: 2025-11-25 19:00:55.372 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:55 np0005535656 nova_compute[187219]: 2025-11-25 19:00:55.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:55 np0005535656 nova_compute[187219]: 2025-11-25 19:00:55.690 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:57 np0005535656 nova_compute[187219]: 2025-11-25 19:00:57.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:57 np0005535656 nova_compute[187219]: 2025-11-25 19:00:57.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:00:57 np0005535656 nova_compute[187219]: 2025-11-25 19:00:57.780 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:00:58 np0005535656 nova_compute[187219]: 2025-11-25 19:00:58.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:00:59.077 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:00:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:00:59.078 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:00:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:00:59.078 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:00:59 np0005535656 nova_compute[187219]: 2025-11-25 19:00:59.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:00:59 np0005535656 nova_compute[187219]: 2025-11-25 19:00:59.711 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:00:59 np0005535656 nova_compute[187219]: 2025-11-25 19:00:59.712 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:00:59 np0005535656 nova_compute[187219]: 2025-11-25 19:00:59.712 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:00:59 np0005535656 nova_compute[187219]: 2025-11-25 19:00:59.713 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:00:59 np0005535656 nova_compute[187219]: 2025-11-25 19:00:59.940 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:00:59 np0005535656 nova_compute[187219]: 2025-11-25 19:00:59.942 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5880MB free_disk=73.1639404296875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:00:59 np0005535656 nova_compute[187219]: 2025-11-25 19:00:59.942 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:00:59 np0005535656 nova_compute[187219]: 2025-11-25 19:00:59.942 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:00 np0005535656 nova_compute[187219]: 2025-11-25 19:01:00.187 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:01:00 np0005535656 nova_compute[187219]: 2025-11-25 19:01:00.187 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:01:00 np0005535656 nova_compute[187219]: 2025-11-25 19:01:00.213 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:01:00 np0005535656 nova_compute[187219]: 2025-11-25 19:01:00.228 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:01:00 np0005535656 nova_compute[187219]: 2025-11-25 19:01:00.230 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:01:00 np0005535656 nova_compute[187219]: 2025-11-25 19:01:00.230 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:01:00 np0005535656 nova_compute[187219]: 2025-11-25 19:01:00.375 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:01 np0005535656 podman[211714]: 2025-11-25 19:01:01.967579015 +0000 UTC m=+0.076539950 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:01:02 np0005535656 podman[211713]: 2025-11-25 19:01:02.013325248 +0000 UTC m=+0.126791713 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 14:01:02 np0005535656 nova_compute[187219]: 2025-11-25 19:01:02.831 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:03 np0005535656 nova_compute[187219]: 2025-11-25 19:01:03.230 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:01:05 np0005535656 nova_compute[187219]: 2025-11-25 19:01:05.377 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:05 np0005535656 podman[197580]: time="2025-11-25T19:01:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:01:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:01:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:01:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:01:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Nov 25 14:01:07 np0005535656 nova_compute[187219]: 2025-11-25 19:01:07.835 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:09 np0005535656 podman[211769]: 2025-11-25 19:01:09.978321248 +0000 UTC m=+0.091992428 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 14:01:10 np0005535656 nova_compute[187219]: 2025-11-25 19:01:10.380 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:12 np0005535656 nova_compute[187219]: 2025-11-25 19:01:12.838 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:12 np0005535656 podman[211790]: 2025-11-25 19:01:12.980925137 +0000 UTC m=+0.090992730 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 14:01:15 np0005535656 nova_compute[187219]: 2025-11-25 19:01:15.382 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:17 np0005535656 nova_compute[187219]: 2025-11-25 19:01:17.841 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:01:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:01:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:01:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:01:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:01:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:01:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:01:20 np0005535656 nova_compute[187219]: 2025-11-25 19:01:20.416 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:22 np0005535656 nova_compute[187219]: 2025-11-25 19:01:22.880 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:24 np0005535656 podman[211810]: 2025-11-25 19:01:24.931651942 +0000 UTC m=+0.056600187 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 14:01:25 np0005535656 nova_compute[187219]: 2025-11-25 19:01:25.456 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:25 np0005535656 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 25 14:01:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:26.921 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:01:26 np0005535656 nova_compute[187219]: 2025-11-25 19:01:26.922 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:26.923 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:01:27 np0005535656 nova_compute[187219]: 2025-11-25 19:01:27.882 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:30 np0005535656 nova_compute[187219]: 2025-11-25 19:01:30.459 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:32 np0005535656 nova_compute[187219]: 2025-11-25 19:01:32.916 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:32 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:32.925 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:01:32 np0005535656 podman[211836]: 2025-11-25 19:01:32.976462716 +0000 UTC m=+0.085784916 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:01:33 np0005535656 podman[211835]: 2025-11-25 19:01:33.021260106 +0000 UTC m=+0.145192920 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 14:01:35 np0005535656 nova_compute[187219]: 2025-11-25 19:01:35.509 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:35 np0005535656 podman[197580]: time="2025-11-25T19:01:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:01:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:01:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:01:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:01:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.499 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.499 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.513 187223 DEBUG nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.611 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.612 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.622 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.622 187223 INFO nova.compute.claims [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.752 187223 DEBUG nova.compute.provider_tree [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.765 187223 DEBUG nova.scheduler.client.report [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.790 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.791 187223 DEBUG nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.855 187223 DEBUG nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.856 187223 DEBUG nova.network.neutron [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.882 187223 INFO nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.901 187223 DEBUG nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:01:37 np0005535656 nova_compute[187219]: 2025-11-25 19:01:37.921 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.007 187223 DEBUG nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.009 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.009 187223 INFO nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Creating image(s)#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.010 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Acquiring lock "/var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.011 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "/var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.012 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "/var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.035 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.087 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.089 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.089 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.111 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.163 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.164 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.199 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.200 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.201 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.255 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.257 187223 DEBUG nova.virt.disk.api [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Checking if we can resize image /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.258 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.309 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.311 187223 DEBUG nova.virt.disk.api [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Cannot resize image /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.312 187223 DEBUG nova.objects.instance [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lazy-loading 'migration_context' on Instance uuid b77a1129-349d-490e-995b-f3af82cad09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.338 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.338 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Ensure instance console log exists: /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.339 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.340 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.340 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:01:38 np0005535656 nova_compute[187219]: 2025-11-25 19:01:38.355 187223 DEBUG nova.policy [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95bf8ec32c354fdda02153bae5f135db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b39b7bed94684051ac6ad318bf49c493', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:01:39 np0005535656 nova_compute[187219]: 2025-11-25 19:01:39.085 187223 DEBUG nova.network.neutron [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Successfully created port: f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:01:39 np0005535656 nova_compute[187219]: 2025-11-25 19:01:39.970 187223 DEBUG nova.network.neutron [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Successfully updated port: f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:01:39 np0005535656 nova_compute[187219]: 2025-11-25 19:01:39.990 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Acquiring lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:01:39 np0005535656 nova_compute[187219]: 2025-11-25 19:01:39.990 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Acquired lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:01:39 np0005535656 nova_compute[187219]: 2025-11-25 19:01:39.990 187223 DEBUG nova.network.neutron [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:01:40 np0005535656 nova_compute[187219]: 2025-11-25 19:01:40.165 187223 DEBUG nova.compute.manager [req-5ac60953-c186-4d27-8fcc-4c1467e35351 req-ce843ed5-a81a-4f7e-a7de-769137a1a54b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-changed-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:01:40 np0005535656 nova_compute[187219]: 2025-11-25 19:01:40.165 187223 DEBUG nova.compute.manager [req-5ac60953-c186-4d27-8fcc-4c1467e35351 req-ce843ed5-a81a-4f7e-a7de-769137a1a54b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Refreshing instance network info cache due to event network-changed-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:01:40 np0005535656 nova_compute[187219]: 2025-11-25 19:01:40.166 187223 DEBUG oslo_concurrency.lockutils [req-5ac60953-c186-4d27-8fcc-4c1467e35351 req-ce843ed5-a81a-4f7e-a7de-769137a1a54b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:01:40 np0005535656 nova_compute[187219]: 2025-11-25 19:01:40.510 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:40 np0005535656 nova_compute[187219]: 2025-11-25 19:01:40.711 187223 DEBUG nova.network.neutron [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:01:40 np0005535656 podman[211893]: 2025-11-25 19:01:40.947864017 +0000 UTC m=+0.067536140 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.259 187223 DEBUG nova.network.neutron [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Updating instance_info_cache with network_info: [{"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.277 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Releasing lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.277 187223 DEBUG nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Instance network_info: |[{"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.277 187223 DEBUG oslo_concurrency.lockutils [req-5ac60953-c186-4d27-8fcc-4c1467e35351 req-ce843ed5-a81a-4f7e-a7de-769137a1a54b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.278 187223 DEBUG nova.network.neutron [req-5ac60953-c186-4d27-8fcc-4c1467e35351 req-ce843ed5-a81a-4f7e-a7de-769137a1a54b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Refreshing network info cache for port f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.280 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Start _get_guest_xml network_info=[{"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.285 187223 WARNING nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.289 187223 DEBUG nova.virt.libvirt.host [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.290 187223 DEBUG nova.virt.libvirt.host [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.303 187223 DEBUG nova.virt.libvirt.host [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.304 187223 DEBUG nova.virt.libvirt.host [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.306 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.306 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.307 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.308 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.308 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.308 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.309 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.309 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.310 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.310 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.311 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.311 187223 DEBUG nova.virt.hardware [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.318 187223 DEBUG nova.virt.libvirt.vif [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2129992396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2129992396',id=9,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b39b7bed94684051ac6ad318bf49c493',ramdisk_id='',reservation_id='r-ekvb0a6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1934751356',owner_user_name='tempest-TestExecuteHostMaintenance
Strategy-1934751356-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:01:37Z,user_data=None,user_id='95bf8ec32c354fdda02153bae5f135db',uuid=b77a1129-349d-490e-995b-f3af82cad09d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.319 187223 DEBUG nova.network.os_vif_util [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Converting VIF {"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.320 187223 DEBUG nova.network.os_vif_util [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:5b:13,bridge_name='br-int',has_traffic_filtering=True,id=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3,network=Network(b9961241-2a95-422e-8981-de6adb72b948),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf21fc782-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.322 187223 DEBUG nova.objects.instance [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lazy-loading 'pci_devices' on Instance uuid b77a1129-349d-490e-995b-f3af82cad09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.346 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <uuid>b77a1129-349d-490e-995b-f3af82cad09d</uuid>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <name>instance-00000009</name>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-2129992396</nova:name>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:01:42</nova:creationTime>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:        <nova:user uuid="95bf8ec32c354fdda02153bae5f135db">tempest-TestExecuteHostMaintenanceStrategy-1934751356-project-member</nova:user>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:        <nova:project uuid="b39b7bed94684051ac6ad318bf49c493">tempest-TestExecuteHostMaintenanceStrategy-1934751356</nova:project>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:        <nova:port uuid="f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <entry name="serial">b77a1129-349d-490e-995b-f3af82cad09d</entry>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <entry name="uuid">b77a1129-349d-490e-995b-f3af82cad09d</entry>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk.config"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:4d:5b:13"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <target dev="tapf21fc782-3d"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/console.log" append="off"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:01:42 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:01:42 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:01:42 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:01:42 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.346 187223 DEBUG nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Preparing to wait for external event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.347 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.347 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.347 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.348 187223 DEBUG nova.virt.libvirt.vif [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2129992396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2129992396',id=9,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b39b7bed94684051ac6ad318bf49c493',ramdisk_id='',reservation_id='r-ekvb0a6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1934751356',owner_user_name='tempest-TestExecuteHostM
aintenanceStrategy-1934751356-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:01:37Z,user_data=None,user_id='95bf8ec32c354fdda02153bae5f135db',uuid=b77a1129-349d-490e-995b-f3af82cad09d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.349 187223 DEBUG nova.network.os_vif_util [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Converting VIF {"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.349 187223 DEBUG nova.network.os_vif_util [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:5b:13,bridge_name='br-int',has_traffic_filtering=True,id=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3,network=Network(b9961241-2a95-422e-8981-de6adb72b948),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf21fc782-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.350 187223 DEBUG os_vif [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:5b:13,bridge_name='br-int',has_traffic_filtering=True,id=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3,network=Network(b9961241-2a95-422e-8981-de6adb72b948),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf21fc782-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.351 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.351 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.352 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.356 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.356 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf21fc782-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.357 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf21fc782-3d, col_values=(('external_ids', {'iface-id': 'f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4d:5b:13', 'vm-uuid': 'b77a1129-349d-490e-995b-f3af82cad09d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.358 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:42 np0005535656 NetworkManager[55548]: <info>  [1764097302.3598] manager: (tapf21fc782-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.361 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.369 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.370 187223 INFO os_vif [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:5b:13,bridge_name='br-int',has_traffic_filtering=True,id=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3,network=Network(b9961241-2a95-422e-8981-de6adb72b948),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf21fc782-3d')#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.436 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.437 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.437 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] No VIF found with MAC fa:16:3e:4d:5b:13, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.438 187223 INFO nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Using config drive#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.884 187223 INFO nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Creating config drive at /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk.config#033[00m
Nov 25 14:01:42 np0005535656 nova_compute[187219]: 2025-11-25 19:01:42.894 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppz44vsob execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.033 187223 DEBUG oslo_concurrency.processutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppz44vsob" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:01:43 np0005535656 kernel: tapf21fc782-3d: entered promiscuous mode
Nov 25 14:01:43 np0005535656 NetworkManager[55548]: <info>  [1764097303.1416] manager: (tapf21fc782-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.143 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:43 np0005535656 ovn_controller[95460]: 2025-11-25T19:01:43Z|00079|binding|INFO|Claiming lport f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for this chassis.
Nov 25 14:01:43 np0005535656 ovn_controller[95460]: 2025-11-25T19:01:43Z|00080|binding|INFO|f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3: Claiming fa:16:3e:4d:5b:13 10.100.0.14
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.150 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.154 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.168 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:5b:13 10.100.0.14'], port_security=['fa:16:3e:4d:5b:13 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b77a1129-349d-490e-995b-f3af82cad09d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9961241-2a95-422e-8981-de6adb72b948', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b39b7bed94684051ac6ad318bf49c493', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a6c87789-d63c-4704-8c71-a6afbc377432', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5989fccf-0ca1-44db-a2a9-02b00f15645e, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.171 104346 INFO neutron.agent.ovn.metadata.agent [-] Port f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 in datapath b9961241-2a95-422e-8981-de6adb72b948 bound to our chassis#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.174 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b9961241-2a95-422e-8981-de6adb72b948#033[00m
Nov 25 14:01:43 np0005535656 systemd-machined[153481]: New machine qemu-7-instance-00000009.
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.198 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[98077aaa-56db-4405-9d81-27a91b9995a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.199 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb9961241-21 in ovnmeta-b9961241-2a95-422e-8981-de6adb72b948 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.203 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb9961241-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.204 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[98c93972-3ba8-4eb7-9ba6-5051bd4b32f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.205 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe31d0e-a3d0-4ec6-9873-9345c3b9eab8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.225 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[1c5a5473-a8e5-4734-9c4e-4b5621e38118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.234 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:43 np0005535656 ovn_controller[95460]: 2025-11-25T19:01:43Z|00081|binding|INFO|Setting lport f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 ovn-installed in OVS
Nov 25 14:01:43 np0005535656 ovn_controller[95460]: 2025-11-25T19:01:43Z|00082|binding|INFO|Setting lport f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 up in Southbound
Nov 25 14:01:43 np0005535656 systemd[1]: Started Virtual Machine qemu-7-instance-00000009.
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.241 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.245 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[bfde30b8-feff-46a8-99fc-81b295a160b4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 systemd-udevd[211955]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:01:43 np0005535656 NetworkManager[55548]: <info>  [1764097303.2678] device (tapf21fc782-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:01:43 np0005535656 NetworkManager[55548]: <info>  [1764097303.2691] device (tapf21fc782-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.275 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[605d00c7-858e-4cf7-9246-6cead3058b6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 systemd-udevd[211959]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.280 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2f319a0c-14a2-484b-a209-787e1a66ce96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 NetworkManager[55548]: <info>  [1764097303.2821] manager: (tapb9961241-20): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Nov 25 14:01:43 np0005535656 podman[211927]: 2025-11-25 19:01:43.288203864 +0000 UTC m=+0.158244831 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.311 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[a0649ae2-f85b-48bc-86c3-73941734c22e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.315 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[1372567e-f481-4ec5-bef4-63ab7638aa49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 NetworkManager[55548]: <info>  [1764097303.3361] device (tapb9961241-20): carrier: link connected
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.339 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[5714bcd1-4b93-4eed-b567-26f97306a83e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.357 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[62cfda9a-90fa-430e-a51d-25da8c726f79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9961241-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:42:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426486, 'reachable_time': 16740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211986, 'error': None, 'target': 'ovnmeta-b9961241-2a95-422e-8981-de6adb72b948', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.373 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[53d46cd4-fdde-4496-aa15-5021d58cec75]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedb:427a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426486, 'tstamp': 426486}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211987, 'error': None, 'target': 'ovnmeta-b9961241-2a95-422e-8981-de6adb72b948', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.396 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[997b400a-6d4f-404d-b59f-1b37f1ad41ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9961241-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:42:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426486, 'reachable_time': 16740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211988, 'error': None, 'target': 'ovnmeta-b9961241-2a95-422e-8981-de6adb72b948', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.432 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca106d4-a426-4411-bcf2-c282df9365ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.482 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[81e9fb98-dfab-4d95-81ec-1198e606e171]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.483 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9961241-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.483 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.484 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9961241-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:01:43 np0005535656 kernel: tapb9961241-20: entered promiscuous mode
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.528 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:43 np0005535656 NetworkManager[55548]: <info>  [1764097303.5293] manager: (tapb9961241-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.531 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.531 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb9961241-20, col_values=(('external_ids', {'iface-id': '163f7bf0-cda9-45b7-ae10-ec3c075f54a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.532 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:43 np0005535656 ovn_controller[95460]: 2025-11-25T19:01:43Z|00083|binding|INFO|Releasing lport 163f7bf0-cda9-45b7-ae10-ec3c075f54a0 from this chassis (sb_readonly=0)
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.543 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.544 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b9961241-2a95-422e-8981-de6adb72b948.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b9961241-2a95-422e-8981-de6adb72b948.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.544 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[9b98016b-fbad-4677-bb2c-70009867ea60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.545 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-b9961241-2a95-422e-8981-de6adb72b948
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/b9961241-2a95-422e-8981-de6adb72b948.pid.haproxy
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID b9961241-2a95-422e-8981-de6adb72b948
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 14:01:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:43.545 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b9961241-2a95-422e-8981-de6adb72b948', 'env', 'PROCESS_TAG=haproxy-b9961241-2a95-422e-8981-de6adb72b948', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b9961241-2a95-422e-8981-de6adb72b948.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.791 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097303.7911453, b77a1129-349d-490e-995b-f3af82cad09d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.792 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] VM Started (Lifecycle Event)#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.809 187223 DEBUG nova.compute.manager [req-0eec9a2e-32f8-45a4-a796-2a05352df4ba req-ac839a46-3050-494a-845a-ae01d2349522 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.809 187223 DEBUG oslo_concurrency.lockutils [req-0eec9a2e-32f8-45a4-a796-2a05352df4ba req-ac839a46-3050-494a-845a-ae01d2349522 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.810 187223 DEBUG oslo_concurrency.lockutils [req-0eec9a2e-32f8-45a4-a796-2a05352df4ba req-ac839a46-3050-494a-845a-ae01d2349522 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.810 187223 DEBUG oslo_concurrency.lockutils [req-0eec9a2e-32f8-45a4-a796-2a05352df4ba req-ac839a46-3050-494a-845a-ae01d2349522 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.810 187223 DEBUG nova.compute.manager [req-0eec9a2e-32f8-45a4-a796-2a05352df4ba req-ac839a46-3050-494a-845a-ae01d2349522 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Processing event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.810 187223 DEBUG nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.815 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.817 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.819 187223 INFO nova.virt.libvirt.driver [-] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Instance spawned successfully.#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.819 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.822 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.839 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.839 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.839 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.840 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.840 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.840 187223 DEBUG nova.virt.libvirt.driver [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.843 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.843 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097303.791396, b77a1129-349d-490e-995b-f3af82cad09d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.843 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.871 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.874 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097303.8141167, b77a1129-349d-490e-995b-f3af82cad09d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.874 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.894 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.897 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.901 187223 INFO nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Took 5.89 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.901 187223 DEBUG nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.925 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:01:43 np0005535656 podman[212026]: 2025-11-25 19:01:43.958022605 +0000 UTC m=+0.062309726 container create bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.974 187223 INFO nova.compute.manager [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Took 6.39 seconds to build instance.#033[00m
Nov 25 14:01:43 np0005535656 systemd[1]: Started libpod-conmon-bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227.scope.
Nov 25 14:01:43 np0005535656 nova_compute[187219]: 2025-11-25 19:01:43.991 187223 DEBUG oslo_concurrency.lockutils [None req-a0d91aed-4f9f-4fee-beb0-c0cbbe8d3678 95bf8ec32c354fdda02153bae5f135db b39b7bed94684051ac6ad318bf49c493 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.492s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:01:44 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:01:44 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f06309cba73c5355a0ace124276937d7065e637ac214439495c9f66da5e24480/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:01:44 np0005535656 podman[212026]: 2025-11-25 19:01:43.933095845 +0000 UTC m=+0.037382996 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:01:44 np0005535656 podman[212026]: 2025-11-25 19:01:44.038495642 +0000 UTC m=+0.142782763 container init bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 14:01:44 np0005535656 podman[212026]: 2025-11-25 19:01:44.048359545 +0000 UTC m=+0.152646666 container start bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 25 14:01:44 np0005535656 neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948[212042]: [NOTICE]   (212046) : New worker (212048) forked
Nov 25 14:01:44 np0005535656 neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948[212042]: [NOTICE]   (212046) : Loading success.
Nov 25 14:01:44 np0005535656 nova_compute[187219]: 2025-11-25 19:01:44.165 187223 DEBUG nova.network.neutron [req-5ac60953-c186-4d27-8fcc-4c1467e35351 req-ce843ed5-a81a-4f7e-a7de-769137a1a54b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Updated VIF entry in instance network info cache for port f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:01:44 np0005535656 nova_compute[187219]: 2025-11-25 19:01:44.166 187223 DEBUG nova.network.neutron [req-5ac60953-c186-4d27-8fcc-4c1467e35351 req-ce843ed5-a81a-4f7e-a7de-769137a1a54b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Updating instance_info_cache with network_info: [{"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:01:44 np0005535656 nova_compute[187219]: 2025-11-25 19:01:44.177 187223 DEBUG oslo_concurrency.lockutils [req-5ac60953-c186-4d27-8fcc-4c1467e35351 req-ce843ed5-a81a-4f7e-a7de-769137a1a54b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:01:45 np0005535656 nova_compute[187219]: 2025-11-25 19:01:45.515 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:45 np0005535656 nova_compute[187219]: 2025-11-25 19:01:45.912 187223 DEBUG nova.compute.manager [req-649e39f6-dae0-4231-8197-277510604b03 req-6343f5ce-7512-4021-af25-094ff0ec5ca4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:01:45 np0005535656 nova_compute[187219]: 2025-11-25 19:01:45.913 187223 DEBUG oslo_concurrency.lockutils [req-649e39f6-dae0-4231-8197-277510604b03 req-6343f5ce-7512-4021-af25-094ff0ec5ca4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:01:45 np0005535656 nova_compute[187219]: 2025-11-25 19:01:45.913 187223 DEBUG oslo_concurrency.lockutils [req-649e39f6-dae0-4231-8197-277510604b03 req-6343f5ce-7512-4021-af25-094ff0ec5ca4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:45 np0005535656 nova_compute[187219]: 2025-11-25 19:01:45.915 187223 DEBUG oslo_concurrency.lockutils [req-649e39f6-dae0-4231-8197-277510604b03 req-6343f5ce-7512-4021-af25-094ff0ec5ca4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:01:45 np0005535656 nova_compute[187219]: 2025-11-25 19:01:45.915 187223 DEBUG nova.compute.manager [req-649e39f6-dae0-4231-8197-277510604b03 req-6343f5ce-7512-4021-af25-094ff0ec5ca4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] No waiting events found dispatching network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:01:45 np0005535656 nova_compute[187219]: 2025-11-25 19:01:45.916 187223 WARNING nova.compute.manager [req-649e39f6-dae0-4231-8197-277510604b03 req-6343f5ce-7512-4021-af25-094ff0ec5ca4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received unexpected event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for instance with vm_state active and task_state None.#033[00m
Nov 25 14:01:47 np0005535656 nova_compute[187219]: 2025-11-25 19:01:47.361 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:01:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:01:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:01:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:01:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:01:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:01:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:01:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:01:50 np0005535656 nova_compute[187219]: 2025-11-25 19:01:50.560 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:50 np0005535656 nova_compute[187219]: 2025-11-25 19:01:50.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:01:52 np0005535656 nova_compute[187219]: 2025-11-25 19:01:52.364 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:53 np0005535656 nova_compute[187219]: 2025-11-25 19:01:53.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:01:53 np0005535656 nova_compute[187219]: 2025-11-25 19:01:53.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:01:53 np0005535656 nova_compute[187219]: 2025-11-25 19:01:53.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:01:53 np0005535656 nova_compute[187219]: 2025-11-25 19:01:53.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:01:54 np0005535656 nova_compute[187219]: 2025-11-25 19:01:54.709 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:01:54 np0005535656 nova_compute[187219]: 2025-11-25 19:01:54.709 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:01:54 np0005535656 nova_compute[187219]: 2025-11-25 19:01:54.710 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 14:01:54 np0005535656 nova_compute[187219]: 2025-11-25 19:01:54.710 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid b77a1129-349d-490e-995b-f3af82cad09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:01:55 np0005535656 nova_compute[187219]: 2025-11-25 19:01:55.561 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:55 np0005535656 podman[212061]: 2025-11-25 19:01:55.954820686 +0000 UTC m=+0.074875283 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 14:01:57 np0005535656 nova_compute[187219]: 2025-11-25 19:01:57.367 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:01:57 np0005535656 ovn_controller[95460]: 2025-11-25T19:01:57Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4d:5b:13 10.100.0.14
Nov 25 14:01:57 np0005535656 ovn_controller[95460]: 2025-11-25T19:01:57Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4d:5b:13 10.100.0.14
Nov 25 14:01:57 np0005535656 nova_compute[187219]: 2025-11-25 19:01:57.633 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Updating instance_info_cache with network_info: [{"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:01:57 np0005535656 nova_compute[187219]: 2025-11-25 19:01:57.665 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:01:57 np0005535656 nova_compute[187219]: 2025-11-25 19:01:57.665 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 14:01:57 np0005535656 nova_compute[187219]: 2025-11-25 19:01:57.666 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:01:57 np0005535656 nova_compute[187219]: 2025-11-25 19:01:57.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:01:57 np0005535656 nova_compute[187219]: 2025-11-25 19:01:57.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:01:57 np0005535656 nova_compute[187219]: 2025-11-25 19:01:57.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:01:58 np0005535656 nova_compute[187219]: 2025-11-25 19:01:58.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:01:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:59.078 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:01:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:59.080 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:01:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:01:59.081 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.564 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.705 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.705 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.706 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.706 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.786 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.840 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.841 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:02:00 np0005535656 nova_compute[187219]: 2025-11-25 19:02:00.929 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.121 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.123 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5716MB free_disk=73.13484573364258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.123 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.124 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.238 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance b77a1129-349d-490e-995b-f3af82cad09d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.238 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.238 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.288 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.325 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.350 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:02:01 np0005535656 nova_compute[187219]: 2025-11-25 19:02:01.350 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:02 np0005535656 nova_compute[187219]: 2025-11-25 19:02:02.369 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:03 np0005535656 podman[212104]: 2025-11-25 19:02:03.977031706 +0000 UTC m=+0.072442787 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:02:04 np0005535656 podman[212103]: 2025-11-25 19:02:04.018572005 +0000 UTC m=+0.119575041 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Nov 25 14:02:05 np0005535656 nova_compute[187219]: 2025-11-25 19:02:05.352 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:05 np0005535656 nova_compute[187219]: 2025-11-25 19:02:05.566 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:05 np0005535656 podman[197580]: time="2025-11-25T19:02:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:02:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:02:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:02:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:02:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Nov 25 14:02:07 np0005535656 nova_compute[187219]: 2025-11-25 19:02:07.413 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:10 np0005535656 nova_compute[187219]: 2025-11-25 19:02:10.568 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:12 np0005535656 podman[212146]: 2025-11-25 19:02:12.002545725 +0000 UTC m=+0.116036653 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, release=1755695350, version=9.6, distribution-scope=public)
Nov 25 14:02:12 np0005535656 nova_compute[187219]: 2025-11-25 19:02:12.333 187223 DEBUG nova.compute.manager [None req-cddec3bc-995f-43b3-bf3d-c53e2b4faf7d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610#033[00m
Nov 25 14:02:12 np0005535656 nova_compute[187219]: 2025-11-25 19:02:12.378 187223 DEBUG nova.compute.provider_tree [None req-cddec3bc-995f-43b3-bf3d-c53e2b4faf7d fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 12 to 15 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 14:02:12 np0005535656 nova_compute[187219]: 2025-11-25 19:02:12.416 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:13 np0005535656 podman[212167]: 2025-11-25 19:02:13.963679388 +0000 UTC m=+0.070483872 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 14:02:15 np0005535656 nova_compute[187219]: 2025-11-25 19:02:15.569 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:17 np0005535656 nova_compute[187219]: 2025-11-25 19:02:17.419 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:17 np0005535656 nova_compute[187219]: 2025-11-25 19:02:17.447 187223 DEBUG nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Check if temp file /var/lib/nova/instances/tmpva5fe6o0 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 25 14:02:17 np0005535656 nova_compute[187219]: 2025-11-25 19:02:17.447 187223 DEBUG nova.compute.manager [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpva5fe6o0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b77a1129-349d-490e-995b-f3af82cad09d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 25 14:02:18 np0005535656 nova_compute[187219]: 2025-11-25 19:02:18.686 187223 DEBUG oslo_concurrency.processutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:02:18 np0005535656 nova_compute[187219]: 2025-11-25 19:02:18.757 187223 DEBUG oslo_concurrency.processutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:02:18 np0005535656 nova_compute[187219]: 2025-11-25 19:02:18.758 187223 DEBUG oslo_concurrency.processutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:02:18 np0005535656 nova_compute[187219]: 2025-11-25 19:02:18.816 187223 DEBUG oslo_concurrency.processutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:02:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:02:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:02:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:02:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:02:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:02:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:02:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:02:20 np0005535656 nova_compute[187219]: 2025-11-25 19:02:20.571 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:22 np0005535656 nova_compute[187219]: 2025-11-25 19:02:22.422 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:22 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 14:02:22 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 14:02:22 np0005535656 systemd-logind[788]: New session 30 of user nova.
Nov 25 14:02:22 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 14:02:22 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 14:02:22 np0005535656 systemd[212196]: Queued start job for default target Main User Target.
Nov 25 14:02:22 np0005535656 systemd[212196]: Created slice User Application Slice.
Nov 25 14:02:22 np0005535656 systemd[212196]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:02:22 np0005535656 systemd[212196]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 14:02:22 np0005535656 systemd[212196]: Reached target Paths.
Nov 25 14:02:22 np0005535656 systemd[212196]: Reached target Timers.
Nov 25 14:02:22 np0005535656 systemd[212196]: Starting D-Bus User Message Bus Socket...
Nov 25 14:02:22 np0005535656 systemd[212196]: Starting Create User's Volatile Files and Directories...
Nov 25 14:02:22 np0005535656 systemd[212196]: Finished Create User's Volatile Files and Directories.
Nov 25 14:02:22 np0005535656 systemd[212196]: Listening on D-Bus User Message Bus Socket.
Nov 25 14:02:22 np0005535656 systemd[212196]: Reached target Sockets.
Nov 25 14:02:22 np0005535656 systemd[212196]: Reached target Basic System.
Nov 25 14:02:22 np0005535656 systemd[212196]: Reached target Main User Target.
Nov 25 14:02:22 np0005535656 systemd[212196]: Startup finished in 175ms.
Nov 25 14:02:22 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 14:02:22 np0005535656 systemd[1]: Started Session 30 of User nova.
Nov 25 14:02:23 np0005535656 systemd[1]: session-30.scope: Deactivated successfully.
Nov 25 14:02:23 np0005535656 systemd-logind[788]: Session 30 logged out. Waiting for processes to exit.
Nov 25 14:02:23 np0005535656 systemd-logind[788]: Removed session 30.
Nov 25 14:02:24 np0005535656 nova_compute[187219]: 2025-11-25 19:02:24.345 187223 DEBUG nova.compute.manager [req-88c57521-8557-4707-9bde-e5f16195e289 req-e6cea03a-a25f-4c29-b699-c2e5a8869b5d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-unplugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:02:24 np0005535656 nova_compute[187219]: 2025-11-25 19:02:24.347 187223 DEBUG oslo_concurrency.lockutils [req-88c57521-8557-4707-9bde-e5f16195e289 req-e6cea03a-a25f-4c29-b699-c2e5a8869b5d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:24 np0005535656 nova_compute[187219]: 2025-11-25 19:02:24.348 187223 DEBUG oslo_concurrency.lockutils [req-88c57521-8557-4707-9bde-e5f16195e289 req-e6cea03a-a25f-4c29-b699-c2e5a8869b5d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:24 np0005535656 nova_compute[187219]: 2025-11-25 19:02:24.349 187223 DEBUG oslo_concurrency.lockutils [req-88c57521-8557-4707-9bde-e5f16195e289 req-e6cea03a-a25f-4c29-b699-c2e5a8869b5d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:24 np0005535656 nova_compute[187219]: 2025-11-25 19:02:24.350 187223 DEBUG nova.compute.manager [req-88c57521-8557-4707-9bde-e5f16195e289 req-e6cea03a-a25f-4c29-b699-c2e5a8869b5d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] No waiting events found dispatching network-vif-unplugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:02:24 np0005535656 nova_compute[187219]: 2025-11-25 19:02:24.350 187223 DEBUG nova.compute.manager [req-88c57521-8557-4707-9bde-e5f16195e289 req-e6cea03a-a25f-4c29-b699-c2e5a8869b5d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-unplugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.149 187223 INFO nova.compute.manager [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Took 6.33 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.150 187223 DEBUG nova.compute.manager [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.170 187223 DEBUG nova.compute.manager [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpva5fe6o0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b77a1129-349d-490e-995b-f3af82cad09d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d94c7009-36d0-4569-a760-0ef449ec6030),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.195 187223 DEBUG nova.objects.instance [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid b77a1129-349d-490e-995b-f3af82cad09d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.197 187223 DEBUG nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.198 187223 DEBUG nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.198 187223 DEBUG nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.214 187223 DEBUG nova.virt.libvirt.vif [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2129992396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2129992396',id=9,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:01:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b39b7bed94684051ac6ad318bf49c493',ramdisk_id='',reservation_id='r-ekvb0a6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1934751356',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1934751356-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:01:43Z,user_data=None,user_id='95bf8ec32c354fdda02153bae5f135db',uuid=b77a1129-349d-490e-995b-f3af82cad09d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.214 187223 DEBUG nova.network.os_vif_util [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.215 187223 DEBUG nova.network.os_vif_util [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4d:5b:13,bridge_name='br-int',has_traffic_filtering=True,id=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3,network=Network(b9961241-2a95-422e-8981-de6adb72b948),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf21fc782-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.215 187223 DEBUG nova.virt.libvirt.migration [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 14:02:25 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:4d:5b:13"/>
Nov 25 14:02:25 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 14:02:25 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:02:25 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 14:02:25 np0005535656 nova_compute[187219]:  <target dev="tapf21fc782-3d"/>
Nov 25 14:02:25 np0005535656 nova_compute[187219]: </interface>
Nov 25 14:02:25 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.216 187223 DEBUG nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.574 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.701 187223 DEBUG nova.virt.libvirt.migration [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.702 187223 INFO nova.virt.libvirt.migration [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 14:02:25 np0005535656 nova_compute[187219]: 2025-11-25 19:02:25.801 187223 INFO nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 14:02:26 np0005535656 ovn_controller[95460]: 2025-11-25T19:02:26Z|00084|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.306 187223 DEBUG nova.virt.libvirt.migration [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.307 187223 DEBUG nova.virt.libvirt.migration [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.419 187223 DEBUG nova.compute.manager [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.419 187223 DEBUG oslo_concurrency.lockutils [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.419 187223 DEBUG oslo_concurrency.lockutils [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.419 187223 DEBUG oslo_concurrency.lockutils [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.420 187223 DEBUG nova.compute.manager [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] No waiting events found dispatching network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.420 187223 WARNING nova.compute.manager [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received unexpected event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.420 187223 DEBUG nova.compute.manager [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-changed-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.420 187223 DEBUG nova.compute.manager [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Refreshing instance network info cache due to event network-changed-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.421 187223 DEBUG oslo_concurrency.lockutils [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.421 187223 DEBUG oslo_concurrency.lockutils [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.421 187223 DEBUG nova.network.neutron [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Refreshing network info cache for port f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.803 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097346.8031096, b77a1129-349d-490e-995b-f3af82cad09d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.804 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.834 187223 DEBUG nova.virt.libvirt.migration [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.835 187223 DEBUG nova.virt.libvirt.migration [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.843 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.848 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.865 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 14:02:26 np0005535656 kernel: tapf21fc782-3d (unregistering): left promiscuous mode
Nov 25 14:02:26 np0005535656 NetworkManager[55548]: <info>  [1764097346.9645] device (tapf21fc782-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.975 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:26 np0005535656 ovn_controller[95460]: 2025-11-25T19:02:26Z|00085|binding|INFO|Releasing lport f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 from this chassis (sb_readonly=0)
Nov 25 14:02:26 np0005535656 ovn_controller[95460]: 2025-11-25T19:02:26Z|00086|binding|INFO|Setting lport f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 down in Southbound
Nov 25 14:02:26 np0005535656 ovn_controller[95460]: 2025-11-25T19:02:26Z|00087|binding|INFO|Removing iface tapf21fc782-3d ovn-installed in OVS
Nov 25 14:02:26 np0005535656 nova_compute[187219]: 2025-11-25 19:02:26.979 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:26 np0005535656 podman[212223]: 2025-11-25 19:02:26.987483385 +0000 UTC m=+0.094940750 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.001 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:5b:13 10.100.0.14'], port_security=['fa:16:3e:4d:5b:13 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b77a1129-349d-490e-995b-f3af82cad09d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9961241-2a95-422e-8981-de6adb72b948', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b39b7bed94684051ac6ad318bf49c493', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a6c87789-d63c-4704-8c71-a6afbc377432', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5989fccf-0ca1-44db-a2a9-02b00f15645e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.002 104346 INFO neutron.agent.ovn.metadata.agent [-] Port f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 in datapath b9961241-2a95-422e-8981-de6adb72b948 unbound from our chassis#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.003 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9961241-2a95-422e-8981-de6adb72b948, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.005 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4b19d1ae-a854-4613-8861-c87ca74b6b73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.007 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b9961241-2a95-422e-8981-de6adb72b948 namespace which is not needed anymore#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.011 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:27 np0005535656 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 25 14:02:27 np0005535656 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000009.scope: Consumed 15.319s CPU time.
Nov 25 14:02:27 np0005535656 systemd-machined[153481]: Machine qemu-7-instance-00000009 terminated.
Nov 25 14:02:27 np0005535656 neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948[212042]: [NOTICE]   (212046) : haproxy version is 2.8.14-c23fe91
Nov 25 14:02:27 np0005535656 neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948[212042]: [NOTICE]   (212046) : path to executable is /usr/sbin/haproxy
Nov 25 14:02:27 np0005535656 neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948[212042]: [WARNING]  (212046) : Exiting Master process...
Nov 25 14:02:27 np0005535656 neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948[212042]: [ALERT]    (212046) : Current worker (212048) exited with code 143 (Terminated)
Nov 25 14:02:27 np0005535656 neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948[212042]: [WARNING]  (212046) : All workers exited. Exiting... (0)
Nov 25 14:02:27 np0005535656 systemd[1]: libpod-bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227.scope: Deactivated successfully.
Nov 25 14:02:27 np0005535656 conmon[212042]: conmon bd794b8b506bdae5deb9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227.scope/container/memory.events
Nov 25 14:02:27 np0005535656 podman[212273]: 2025-11-25 19:02:27.148645685 +0000 UTC m=+0.048617367 container died bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 25 14:02:27 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227-userdata-shm.mount: Deactivated successfully.
Nov 25 14:02:27 np0005535656 systemd[1]: var-lib-containers-storage-overlay-f06309cba73c5355a0ace124276937d7065e637ac214439495c9f66da5e24480-merged.mount: Deactivated successfully.
Nov 25 14:02:27 np0005535656 podman[212273]: 2025-11-25 19:02:27.204932083 +0000 UTC m=+0.104903795 container cleanup bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.218 187223 DEBUG nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.219 187223 DEBUG nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.219 187223 DEBUG nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 14:02:27 np0005535656 systemd[1]: libpod-conmon-bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227.scope: Deactivated successfully.
Nov 25 14:02:27 np0005535656 podman[212318]: 2025-11-25 19:02:27.290589744 +0000 UTC m=+0.053103690 container remove bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.299 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a64e75c1-3e3e-4b2f-8d1d-6cdb96876ca0]: (4, ('Tue Nov 25 07:02:27 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948 (bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227)\nbd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227\nTue Nov 25 07:02:27 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b9961241-2a95-422e-8981-de6adb72b948 (bd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227)\nbd794b8b506bdae5deb9395e761367c39fb569ec11c5ff19b0416ba1ff6af227\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.301 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b0bb067a-fe48-4985-a4da-e72ac937f02a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.302 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9961241-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.303 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:27 np0005535656 kernel: tapb9961241-20: left promiscuous mode
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.319 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.320 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[aa966664-2801-43eb-b6d8-4492bb5e3583]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.336 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5a5b07-6c51-4e29-8b7c-96f473d8e5f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.337 187223 DEBUG nova.virt.libvirt.guest [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'b77a1129-349d-490e-995b-f3af82cad09d' (instance-00000009) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.337 187223 INFO nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Migration operation has completed#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.337 187223 INFO nova.compute.manager [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] _post_live_migration() is started..#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.338 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[719addcc-6d4d-416f-b4e9-1074c1dbd52f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.360 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[22902897-1e54-4925-b2fc-1b4cd3172798]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426479, 'reachable_time': 35745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212336, 'error': None, 'target': 'ovnmeta-b9961241-2a95-422e-8981-de6adb72b948', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:02:27 np0005535656 systemd[1]: run-netns-ovnmeta\x2db9961241\x2d2a95\x2d422e\x2d8981\x2dde6adb72b948.mount: Deactivated successfully.
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.364 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b9961241-2a95-422e-8981-de6adb72b948 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.364 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c479ef-780b-4c50-aa98-8d393b118f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.424 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.533 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.534 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:27.535 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.788 187223 DEBUG nova.network.neutron [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Updated VIF entry in instance network info cache for port f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.788 187223 DEBUG nova.network.neutron [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Updating instance_info_cache with network_info: [{"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:02:27 np0005535656 nova_compute[187219]: 2025-11-25 19:02:27.810 187223 DEBUG oslo_concurrency.lockutils [req-23fe728b-4bbc-47bb-a465-4146713e60bc req-485bd1cb-7fe0-4abf-b364-1b42ea0ebbb5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-b77a1129-349d-490e-995b-f3af82cad09d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.097 187223 DEBUG nova.network.neutron [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Activated binding for port f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.098 187223 DEBUG nova.compute.manager [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.099 187223 DEBUG nova.virt.libvirt.vif [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2129992396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2129992396',id=9,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:01:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b39b7bed94684051ac6ad318bf49c493',ramdisk_id='',reservation_id='r-ekvb0a6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1934751356',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1934751356-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:02:15Z,user_data=None,user_id='95bf8ec32c354fdda02153bae5f135db',uuid=b77a1129-349d-490e-995b-f3af82cad09d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.099 187223 DEBUG nova.network.os_vif_util [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "address": "fa:16:3e:4d:5b:13", "network": {"id": "b9961241-2a95-422e-8981-de6adb72b948", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-372111329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b39b7bed94684051ac6ad318bf49c493", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf21fc782-3d", "ovs_interfaceid": "f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.100 187223 DEBUG nova.network.os_vif_util [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4d:5b:13,bridge_name='br-int',has_traffic_filtering=True,id=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3,network=Network(b9961241-2a95-422e-8981-de6adb72b948),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf21fc782-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.100 187223 DEBUG os_vif [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4d:5b:13,bridge_name='br-int',has_traffic_filtering=True,id=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3,network=Network(b9961241-2a95-422e-8981-de6adb72b948),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf21fc782-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.101 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.102 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf21fc782-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.103 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.105 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.107 187223 INFO os_vif [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4d:5b:13,bridge_name='br-int',has_traffic_filtering=True,id=f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3,network=Network(b9961241-2a95-422e-8981-de6adb72b948),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf21fc782-3d')#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.107 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.107 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.108 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.108 187223 DEBUG nova.compute.manager [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.108 187223 INFO nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Deleting instance files /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d_del#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.109 187223 INFO nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Deletion of /var/lib/nova/instances/b77a1129-349d-490e-995b-f3af82cad09d_del complete#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.508 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-unplugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.508 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.508 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.509 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.509 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] No waiting events found dispatching network-vif-unplugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.509 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-unplugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.509 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.510 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.510 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.510 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.510 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] No waiting events found dispatching network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.510 187223 WARNING nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received unexpected event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.511 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.511 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.511 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.511 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.512 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] No waiting events found dispatching network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.512 187223 WARNING nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received unexpected event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.512 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-unplugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.512 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.512 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.513 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.513 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] No waiting events found dispatching network-vif-unplugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.513 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-unplugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.513 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.513 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.514 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.514 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.514 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] No waiting events found dispatching network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.515 187223 WARNING nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received unexpected event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.515 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.515 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.515 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.516 187223 DEBUG oslo_concurrency.lockutils [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.516 187223 DEBUG nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] No waiting events found dispatching network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:02:28 np0005535656 nova_compute[187219]: 2025-11-25 19:02:28.516 187223 WARNING nova.compute.manager [req-84d15a79-bdcb-405c-81aa-1cdab80a2ab2 req-d05401f1-7b3d-4a00-bf89-77dd5f511391 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Received unexpected event network-vif-plugged-f21fc782-3d1f-44ba-ad7d-a8ffbaf9ebf3 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:02:30 np0005535656 nova_compute[187219]: 2025-11-25 19:02:30.576 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:31.538 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:02:33 np0005535656 nova_compute[187219]: 2025-11-25 19:02:33.105 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:33 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 14:02:33 np0005535656 systemd[212196]: Activating special unit Exit the Session...
Nov 25 14:02:33 np0005535656 systemd[212196]: Stopped target Main User Target.
Nov 25 14:02:33 np0005535656 systemd[212196]: Stopped target Basic System.
Nov 25 14:02:33 np0005535656 systemd[212196]: Stopped target Paths.
Nov 25 14:02:33 np0005535656 systemd[212196]: Stopped target Sockets.
Nov 25 14:02:33 np0005535656 systemd[212196]: Stopped target Timers.
Nov 25 14:02:33 np0005535656 systemd[212196]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:02:33 np0005535656 systemd[212196]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 14:02:33 np0005535656 systemd[212196]: Closed D-Bus User Message Bus Socket.
Nov 25 14:02:33 np0005535656 systemd[212196]: Stopped Create User's Volatile Files and Directories.
Nov 25 14:02:33 np0005535656 systemd[212196]: Removed slice User Application Slice.
Nov 25 14:02:33 np0005535656 systemd[212196]: Reached target Shutdown.
Nov 25 14:02:33 np0005535656 systemd[212196]: Finished Exit the Session.
Nov 25 14:02:33 np0005535656 systemd[212196]: Reached target Exit the Session.
Nov 25 14:02:33 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 14:02:33 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 14:02:33 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 14:02:33 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 14:02:33 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 14:02:33 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 14:02:33 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 14:02:34 np0005535656 podman[212339]: 2025-11-25 19:02:34.953499907 +0000 UTC m=+0.066087470 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:02:34 np0005535656 podman[212338]: 2025-11-25 19:02:34.981322098 +0000 UTC m=+0.097310774 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 25 14:02:35 np0005535656 nova_compute[187219]: 2025-11-25 19:02:35.577 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:35 np0005535656 podman[197580]: time="2025-11-25T19:02:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:02:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:02:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:02:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:02:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.564 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "b77a1129-349d-490e-995b-f3af82cad09d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.564 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.565 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "b77a1129-349d-490e-995b-f3af82cad09d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.587 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.587 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.588 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.588 187223 DEBUG nova.compute.resource_tracker [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.777 187223 WARNING nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.778 187223 DEBUG nova.compute.resource_tracker [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5884MB free_disk=73.16384506225586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.778 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.778 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.828 187223 DEBUG nova.compute.resource_tracker [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration for instance b77a1129-349d-490e-995b-f3af82cad09d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.853 187223 DEBUG nova.compute.resource_tracker [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.880 187223 DEBUG nova.compute.resource_tracker [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration d94c7009-36d0-4569-a760-0ef449ec6030 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.880 187223 DEBUG nova.compute.resource_tracker [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.880 187223 DEBUG nova.compute.resource_tracker [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.917 187223 DEBUG nova.compute.provider_tree [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.940 187223 DEBUG nova.scheduler.client.report [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.960 187223 DEBUG nova.compute.resource_tracker [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.960 187223 DEBUG oslo_concurrency.lockutils [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:36 np0005535656 nova_compute[187219]: 2025-11-25 19:02:36.965 187223 INFO nova.compute.manager [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 25 14:02:37 np0005535656 nova_compute[187219]: 2025-11-25 19:02:37.062 187223 INFO nova.scheduler.client.report [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Deleted allocation for migration d94c7009-36d0-4569-a760-0ef449ec6030#033[00m
Nov 25 14:02:37 np0005535656 nova_compute[187219]: 2025-11-25 19:02:37.063 187223 DEBUG nova.virt.libvirt.driver [None req-546029e9-89ad-47ec-9f24-d33c2309801e fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 25 14:02:38 np0005535656 nova_compute[187219]: 2025-11-25 19:02:38.109 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:40 np0005535656 nova_compute[187219]: 2025-11-25 19:02:40.578 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:42 np0005535656 nova_compute[187219]: 2025-11-25 19:02:42.216 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764097347.2150366, b77a1129-349d-490e-995b-f3af82cad09d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:02:42 np0005535656 nova_compute[187219]: 2025-11-25 19:02:42.216 187223 INFO nova.compute.manager [-] [instance: b77a1129-349d-490e-995b-f3af82cad09d] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:02:42 np0005535656 nova_compute[187219]: 2025-11-25 19:02:42.250 187223 DEBUG nova.compute.manager [None req-d2c1d725-0dea-408b-a1ca-7ed6825e530d - - - - - -] [instance: b77a1129-349d-490e-995b-f3af82cad09d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:02:42 np0005535656 podman[212384]: 2025-11-25 19:02:42.979075178 +0000 UTC m=+0.086945667 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 14:02:43 np0005535656 nova_compute[187219]: 2025-11-25 19:02:43.112 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:44 np0005535656 podman[212405]: 2025-11-25 19:02:44.974470488 +0000 UTC m=+0.085572379 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 25 14:02:45 np0005535656 nova_compute[187219]: 2025-11-25 19:02:45.580 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:48 np0005535656 nova_compute[187219]: 2025-11-25 19:02:48.114 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:02:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:02:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:02:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:02:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:02:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:02:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:02:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:02:50 np0005535656 nova_compute[187219]: 2025-11-25 19:02:50.582 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:51 np0005535656 nova_compute[187219]: 2025-11-25 19:02:51.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:53 np0005535656 nova_compute[187219]: 2025-11-25 19:02:53.140 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:54 np0005535656 nova_compute[187219]: 2025-11-25 19:02:54.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:55 np0005535656 nova_compute[187219]: 2025-11-25 19:02:55.584 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:55 np0005535656 nova_compute[187219]: 2025-11-25 19:02:55.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:55 np0005535656 nova_compute[187219]: 2025-11-25 19:02:55.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:55 np0005535656 nova_compute[187219]: 2025-11-25 19:02:55.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:02:55 np0005535656 nova_compute[187219]: 2025-11-25 19:02:55.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:02:55 np0005535656 nova_compute[187219]: 2025-11-25 19:02:55.695 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:02:57 np0005535656 podman[212426]: 2025-11-25 19:02:57.97741193 +0000 UTC m=+0.097217172 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:02:58 np0005535656 nova_compute[187219]: 2025-11-25 19:02:58.143 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:02:58 np0005535656 nova_compute[187219]: 2025-11-25 19:02:58.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:59.079 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:02:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:59.080 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:02:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:02:59.080 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:02:59 np0005535656 nova_compute[187219]: 2025-11-25 19:02:59.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:59 np0005535656 nova_compute[187219]: 2025-11-25 19:02:59.694 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:59 np0005535656 nova_compute[187219]: 2025-11-25 19:02:59.695 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:02:59 np0005535656 nova_compute[187219]: 2025-11-25 19:02:59.695 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:03:00 np0005535656 nova_compute[187219]: 2025-11-25 19:03:00.616 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:02 np0005535656 nova_compute[187219]: 2025-11-25 19:03:02.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:03:02 np0005535656 nova_compute[187219]: 2025-11-25 19:03:02.749 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:03:02 np0005535656 nova_compute[187219]: 2025-11-25 19:03:02.750 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:03:02 np0005535656 nova_compute[187219]: 2025-11-25 19:03:02.750 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:03:02 np0005535656 nova_compute[187219]: 2025-11-25 19:03:02.751 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:03:02 np0005535656 nova_compute[187219]: 2025-11-25 19:03:02.981 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:03:02 np0005535656 nova_compute[187219]: 2025-11-25 19:03:02.983 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5905MB free_disk=73.16390609741211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:03:02 np0005535656 nova_compute[187219]: 2025-11-25 19:03:02.983 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:03:02 np0005535656 nova_compute[187219]: 2025-11-25 19:03:02.983 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.050 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.051 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.084 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing inventories for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.108 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating ProviderTree inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.108 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.124 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing aggregate associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.175 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing trait associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STATUS_DISABLED,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.180 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.199 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.217 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.220 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:03:03 np0005535656 nova_compute[187219]: 2025-11-25 19:03:03.220 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:03:05 np0005535656 nova_compute[187219]: 2025-11-25 19:03:05.618 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:05 np0005535656 podman[197580]: time="2025-11-25T19:03:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:03:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:03:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:03:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:03:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Nov 25 14:03:05 np0005535656 podman[212452]: 2025-11-25 19:03:05.987845022 +0000 UTC m=+0.091927315 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 25 14:03:06 np0005535656 podman[212451]: 2025-11-25 19:03:06.038384532 +0000 UTC m=+0.145556510 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 14:03:06 np0005535656 nova_compute[187219]: 2025-11-25 19:03:06.221 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:03:06 np0005535656 nova_compute[187219]: 2025-11-25 19:03:06.344 187223 DEBUG nova.compute.manager [None req-10512172-aa8e-4351-8092-493e9be9e5fc 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606#033[00m
Nov 25 14:03:06 np0005535656 nova_compute[187219]: 2025-11-25 19:03:06.392 187223 DEBUG nova.compute.provider_tree [None req-10512172-aa8e-4351-8092-493e9be9e5fc 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 17 to 18 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 14:03:07 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:03:07.164 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:03:07 np0005535656 nova_compute[187219]: 2025-11-25 19:03:07.165 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:07 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:03:07.166 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:03:07 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:03:07.169 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:03:08 np0005535656 nova_compute[187219]: 2025-11-25 19:03:08.182 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:08 np0005535656 nova_compute[187219]: 2025-11-25 19:03:08.983 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:10 np0005535656 nova_compute[187219]: 2025-11-25 19:03:10.619 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:13 np0005535656 nova_compute[187219]: 2025-11-25 19:03:13.184 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:13 np0005535656 podman[212496]: 2025-11-25 19:03:13.971758671 +0000 UTC m=+0.086484035 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Nov 25 14:03:15 np0005535656 nova_compute[187219]: 2025-11-25 19:03:15.654 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:15 np0005535656 podman[212517]: 2025-11-25 19:03:15.9677882 +0000 UTC m=+0.081287531 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 25 14:03:18 np0005535656 nova_compute[187219]: 2025-11-25 19:03:18.187 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:03:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:03:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:03:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:03:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:03:20 np0005535656 nova_compute[187219]: 2025-11-25 19:03:20.656 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:23 np0005535656 nova_compute[187219]: 2025-11-25 19:03:23.193 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:25 np0005535656 nova_compute[187219]: 2025-11-25 19:03:25.658 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:28 np0005535656 nova_compute[187219]: 2025-11-25 19:03:28.196 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:28 np0005535656 podman[212536]: 2025-11-25 19:03:28.953313629 +0000 UTC m=+0.072719533 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:03:30 np0005535656 nova_compute[187219]: 2025-11-25 19:03:30.707 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:33 np0005535656 nova_compute[187219]: 2025-11-25 19:03:33.236 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:35 np0005535656 podman[197580]: time="2025-11-25T19:03:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:03:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:03:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:03:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:03:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Nov 25 14:03:35 np0005535656 nova_compute[187219]: 2025-11-25 19:03:35.709 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:36 np0005535656 podman[212563]: 2025-11-25 19:03:36.975120976 +0000 UTC m=+0.088243794 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 25 14:03:37 np0005535656 podman[212562]: 2025-11-25 19:03:37.030340984 +0000 UTC m=+0.145382034 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 25 14:03:38 np0005535656 nova_compute[187219]: 2025-11-25 19:03:38.239 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:40 np0005535656 nova_compute[187219]: 2025-11-25 19:03:40.710 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:43 np0005535656 nova_compute[187219]: 2025-11-25 19:03:43.241 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:44 np0005535656 podman[212602]: 2025-11-25 19:03:44.96390628 +0000 UTC m=+0.081839876 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7)
Nov 25 14:03:45 np0005535656 nova_compute[187219]: 2025-11-25 19:03:45.712 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:45 np0005535656 ovn_controller[95460]: 2025-11-25T19:03:45Z|00088|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 25 14:03:46 np0005535656 podman[212624]: 2025-11-25 19:03:46.976652121 +0000 UTC m=+0.091653738 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 25 14:03:48 np0005535656 nova_compute[187219]: 2025-11-25 19:03:48.242 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:03:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:03:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:03:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:03:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:03:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:03:50 np0005535656 nova_compute[187219]: 2025-11-25 19:03:50.716 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:51 np0005535656 nova_compute[187219]: 2025-11-25 19:03:51.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:03:53 np0005535656 nova_compute[187219]: 2025-11-25 19:03:53.244 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:55 np0005535656 nova_compute[187219]: 2025-11-25 19:03:55.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:03:55 np0005535656 nova_compute[187219]: 2025-11-25 19:03:55.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:03:55 np0005535656 nova_compute[187219]: 2025-11-25 19:03:55.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:03:55 np0005535656 nova_compute[187219]: 2025-11-25 19:03:55.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:03:55 np0005535656 nova_compute[187219]: 2025-11-25 19:03:55.697 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:03:55 np0005535656 nova_compute[187219]: 2025-11-25 19:03:55.698 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:03:55 np0005535656 nova_compute[187219]: 2025-11-25 19:03:55.717 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:58 np0005535656 nova_compute[187219]: 2025-11-25 19:03:58.246 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:03:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:03:59.080 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:03:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:03:59.080 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:03:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:03:59.080 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:03:59 np0005535656 podman[212645]: 2025-11-25 19:03:59.964413161 +0000 UTC m=+0.078047432 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:04:00 np0005535656 nova_compute[187219]: 2025-11-25 19:04:00.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:04:00 np0005535656 nova_compute[187219]: 2025-11-25 19:04:00.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:04:00 np0005535656 nova_compute[187219]: 2025-11-25 19:04:00.674 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:04:00 np0005535656 nova_compute[187219]: 2025-11-25 19:04:00.719 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:01 np0005535656 nova_compute[187219]: 2025-11-25 19:04:01.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:04:03 np0005535656 nova_compute[187219]: 2025-11-25 19:04:03.278 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.697 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.698 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.698 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.699 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.890 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.890 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5892MB free_disk=73.16390609741211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.891 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.891 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.965 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.965 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:04:04 np0005535656 nova_compute[187219]: 2025-11-25 19:04:04.994 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:04:05 np0005535656 nova_compute[187219]: 2025-11-25 19:04:05.024 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:04:05 np0005535656 nova_compute[187219]: 2025-11-25 19:04:05.025 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:04:05 np0005535656 nova_compute[187219]: 2025-11-25 19:04:05.026 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:04:05 np0005535656 podman[197580]: time="2025-11-25T19:04:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:04:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:04:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:04:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:04:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Nov 25 14:04:05 np0005535656 nova_compute[187219]: 2025-11-25 19:04:05.721 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:07 np0005535656 podman[212671]: 2025-11-25 19:04:07.95848689 +0000 UTC m=+0.070153083 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 14:04:08 np0005535656 podman[212670]: 2025-11-25 19:04:08.004250366 +0000 UTC m=+0.126419568 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 25 14:04:08 np0005535656 nova_compute[187219]: 2025-11-25 19:04:08.025 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:04:08 np0005535656 nova_compute[187219]: 2025-11-25 19:04:08.281 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:09 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:09.175 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:04:09 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:09.176 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:04:09 np0005535656 nova_compute[187219]: 2025-11-25 19:04:09.206 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:10 np0005535656 nova_compute[187219]: 2025-11-25 19:04:10.723 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:13 np0005535656 nova_compute[187219]: 2025-11-25 19:04:13.286 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:15 np0005535656 nova_compute[187219]: 2025-11-25 19:04:15.726 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:15 np0005535656 podman[212714]: 2025-11-25 19:04:15.855858023 +0000 UTC m=+0.098115867 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 25 14:04:17 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:17.180 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:04:17 np0005535656 podman[212735]: 2025-11-25 19:04:17.931169735 +0000 UTC m=+0.050513408 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 14:04:18 np0005535656 nova_compute[187219]: 2025-11-25 19:04:18.288 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:04:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:04:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:04:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:04:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:04:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:04:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:04:20 np0005535656 nova_compute[187219]: 2025-11-25 19:04:20.728 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:23 np0005535656 nova_compute[187219]: 2025-11-25 19:04:23.293 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:25 np0005535656 nova_compute[187219]: 2025-11-25 19:04:25.767 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:28 np0005535656 nova_compute[187219]: 2025-11-25 19:04:28.296 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:30 np0005535656 nova_compute[187219]: 2025-11-25 19:04:30.768 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:30 np0005535656 podman[212757]: 2025-11-25 19:04:30.948648489 +0000 UTC m=+0.067741656 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:04:33 np0005535656 nova_compute[187219]: 2025-11-25 19:04:33.328 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:35 np0005535656 podman[197580]: time="2025-11-25T19:04:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:04:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:04:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:04:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:04:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Nov 25 14:04:35 np0005535656 nova_compute[187219]: 2025-11-25 19:04:35.769 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.151 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.152 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.168 187223 DEBUG nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.269 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.270 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.278 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.278 187223 INFO nova.compute.claims [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.333 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.395 187223 DEBUG nova.compute.provider_tree [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.410 187223 DEBUG nova.scheduler.client.report [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.440 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.441 187223 DEBUG nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.502 187223 DEBUG nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.502 187223 DEBUG nova.network.neutron [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.524 187223 INFO nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.548 187223 DEBUG nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.648 187223 DEBUG nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.650 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.651 187223 INFO nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Creating image(s)#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.652 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "/var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.653 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.654 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.680 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.763 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.765 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.766 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.788 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.816 187223 DEBUG nova.policy [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e60aa8a36ef94fa186a5c8de1df9e594', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab3670f92d82410b981d159346c0c038', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.869 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.870 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.924 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.925 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:04:38 np0005535656 nova_compute[187219]: 2025-11-25 19:04:38.925 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:04:38 np0005535656 podman[212789]: 2025-11-25 19:04:38.982201661 +0000 UTC m=+0.097306684 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.001 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.002 187223 DEBUG nova.virt.disk.api [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Checking if we can resize image /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.003 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:04:39 np0005535656 podman[212786]: 2025-11-25 19:04:39.010400541 +0000 UTC m=+0.121644558 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.056 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.057 187223 DEBUG nova.virt.disk.api [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Cannot resize image /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.057 187223 DEBUG nova.objects.instance [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'migration_context' on Instance uuid 9503150b-9383-4483-8191-33e5f93b4550 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.077 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.077 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Ensure instance console log exists: /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.078 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.078 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.079 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:04:39 np0005535656 nova_compute[187219]: 2025-11-25 19:04:39.607 187223 DEBUG nova.network.neutron [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Successfully created port: 30b651bd-11f7-4be4-a855-ce8a0cb28154 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:04:40 np0005535656 nova_compute[187219]: 2025-11-25 19:04:40.296 187223 DEBUG nova.network.neutron [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Successfully updated port: 30b651bd-11f7-4be4-a855-ce8a0cb28154 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:04:40 np0005535656 nova_compute[187219]: 2025-11-25 19:04:40.326 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:04:40 np0005535656 nova_compute[187219]: 2025-11-25 19:04:40.326 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquired lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:04:40 np0005535656 nova_compute[187219]: 2025-11-25 19:04:40.327 187223 DEBUG nova.network.neutron [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:04:40 np0005535656 nova_compute[187219]: 2025-11-25 19:04:40.418 187223 DEBUG nova.compute.manager [req-a5c0aca7-e714-4bb5-8bde-e462eb8d2453 req-68bcf07a-17a6-4fd0-ac51-5af48ab6c89e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-changed-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:04:40 np0005535656 nova_compute[187219]: 2025-11-25 19:04:40.419 187223 DEBUG nova.compute.manager [req-a5c0aca7-e714-4bb5-8bde-e462eb8d2453 req-68bcf07a-17a6-4fd0-ac51-5af48ab6c89e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Refreshing instance network info cache due to event network-changed-30b651bd-11f7-4be4-a855-ce8a0cb28154. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:04:40 np0005535656 nova_compute[187219]: 2025-11-25 19:04:40.420 187223 DEBUG oslo_concurrency.lockutils [req-a5c0aca7-e714-4bb5-8bde-e462eb8d2453 req-68bcf07a-17a6-4fd0-ac51-5af48ab6c89e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:04:40 np0005535656 nova_compute[187219]: 2025-11-25 19:04:40.531 187223 DEBUG nova.network.neutron [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:04:40 np0005535656 nova_compute[187219]: 2025-11-25 19:04:40.772 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.335 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.775 187223 DEBUG nova.network.neutron [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Updating instance_info_cache with network_info: [{"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.802 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Releasing lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.802 187223 DEBUG nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Instance network_info: |[{"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.803 187223 DEBUG oslo_concurrency.lockutils [req-a5c0aca7-e714-4bb5-8bde-e462eb8d2453 req-68bcf07a-17a6-4fd0-ac51-5af48ab6c89e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.803 187223 DEBUG nova.network.neutron [req-a5c0aca7-e714-4bb5-8bde-e462eb8d2453 req-68bcf07a-17a6-4fd0-ac51-5af48ab6c89e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Refreshing network info cache for port 30b651bd-11f7-4be4-a855-ce8a0cb28154 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.807 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Start _get_guest_xml network_info=[{"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.814 187223 WARNING nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.820 187223 DEBUG nova.virt.libvirt.host [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.821 187223 DEBUG nova.virt.libvirt.host [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.825 187223 DEBUG nova.virt.libvirt.host [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.826 187223 DEBUG nova.virt.libvirt.host [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.827 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.827 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.828 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.828 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.828 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.828 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.829 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.829 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.829 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.830 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.830 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.830 187223 DEBUG nova.virt.hardware [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.834 187223 DEBUG nova.virt.libvirt.vif [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:04:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1329786635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1329786635',id=12,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-zqiuwdcb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=TagList,t
ask_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:04:38Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=9503150b-9383-4483-8191-33e5f93b4550,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.835 187223 DEBUG nova.network.os_vif_util [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.835 187223 DEBUG nova.network.os_vif_util [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:82:e6,bridge_name='br-int',has_traffic_filtering=True,id=30b651bd-11f7-4be4-a855-ce8a0cb28154,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b651bd-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.836 187223 DEBUG nova.objects.instance [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9503150b-9383-4483-8191-33e5f93b4550 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.854 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <uuid>9503150b-9383-4483-8191-33e5f93b4550</uuid>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <name>instance-0000000c</name>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteStrategies-server-1329786635</nova:name>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:04:43</nova:creationTime>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:        <nova:user uuid="e60aa8a36ef94fa186a5c8de1df9e594">tempest-TestExecuteStrategies-2025590332-project-member</nova:user>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:        <nova:project uuid="ab3670f92d82410b981d159346c0c038">tempest-TestExecuteStrategies-2025590332</nova:project>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:        <nova:port uuid="30b651bd-11f7-4be4-a855-ce8a0cb28154">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <entry name="serial">9503150b-9383-4483-8191-33e5f93b4550</entry>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <entry name="uuid">9503150b-9383-4483-8191-33e5f93b4550</entry>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk.config"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:ae:82:e6"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <target dev="tap30b651bd-11"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/console.log" append="off"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:04:43 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:04:43 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:04:43 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:04:43 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.855 187223 DEBUG nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Preparing to wait for external event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.856 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.857 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.857 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.858 187223 DEBUG nova.virt.libvirt.vif [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:04:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1329786635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1329786635',id=12,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-zqiuwdcb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:04:38Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=9503150b-9383-4483-8191-33e5f93b4550,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.859 187223 DEBUG nova.network.os_vif_util [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.860 187223 DEBUG nova.network.os_vif_util [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:82:e6,bridge_name='br-int',has_traffic_filtering=True,id=30b651bd-11f7-4be4-a855-ce8a0cb28154,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b651bd-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.861 187223 DEBUG os_vif [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:82:e6,bridge_name='br-int',has_traffic_filtering=True,id=30b651bd-11f7-4be4-a855-ce8a0cb28154,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b651bd-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.862 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.863 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.864 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.868 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.868 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30b651bd-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.869 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap30b651bd-11, col_values=(('external_ids', {'iface-id': '30b651bd-11f7-4be4-a855-ce8a0cb28154', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:82:e6', 'vm-uuid': '9503150b-9383-4483-8191-33e5f93b4550'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:04:43 np0005535656 NetworkManager[55548]: <info>  [1764097483.8731] manager: (tap30b651bd-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.872 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.874 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.882 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.883 187223 INFO os_vif [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:82:e6,bridge_name='br-int',has_traffic_filtering=True,id=30b651bd-11f7-4be4-a855-ce8a0cb28154,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b651bd-11')#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.946 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.946 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.947 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No VIF found with MAC fa:16:3e:ae:82:e6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:04:43 np0005535656 nova_compute[187219]: 2025-11-25 19:04:43.948 187223 INFO nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Using config drive#033[00m
Nov 25 14:04:45 np0005535656 nova_compute[187219]: 2025-11-25 19:04:45.776 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:45 np0005535656 nova_compute[187219]: 2025-11-25 19:04:45.799 187223 INFO nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Creating config drive at /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk.config#033[00m
Nov 25 14:04:45 np0005535656 nova_compute[187219]: 2025-11-25 19:04:45.808 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9peh0gi4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:04:45 np0005535656 nova_compute[187219]: 2025-11-25 19:04:45.946 187223 DEBUG oslo_concurrency.processutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9peh0gi4" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:04:46 np0005535656 kernel: tap30b651bd-11: entered promiscuous mode
Nov 25 14:04:46 np0005535656 NetworkManager[55548]: <info>  [1764097486.0323] manager: (tap30b651bd-11): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Nov 25 14:04:46 np0005535656 ovn_controller[95460]: 2025-11-25T19:04:46Z|00089|binding|INFO|Claiming lport 30b651bd-11f7-4be4-a855-ce8a0cb28154 for this chassis.
Nov 25 14:04:46 np0005535656 ovn_controller[95460]: 2025-11-25T19:04:46Z|00090|binding|INFO|30b651bd-11f7-4be4-a855-ce8a0cb28154: Claiming fa:16:3e:ae:82:e6 10.100.0.10
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.030 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.034 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.044 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.056 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:82:e6 10.100.0.10'], port_security=['fa:16:3e:ae:82:e6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9503150b-9383-4483-8191-33e5f93b4550', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=30b651bd-11f7-4be4-a855-ce8a0cb28154) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.057 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 30b651bd-11f7-4be4-a855-ce8a0cb28154 in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 bound to our chassis#033[00m
Nov 25 14:04:46 np0005535656 systemd-udevd[212867]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.059 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.072 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4084f081-56ea-4dca-8d9a-75c447c08abb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.073 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e881e87-b1 in ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.077 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e881e87-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.077 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b8193902-8e11-4bd3-b16b-09eac81cf41e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.079 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9779ef-7fd4-4e62-9d7d-6f85484cb236]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 NetworkManager[55548]: <info>  [1764097486.0812] device (tap30b651bd-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:04:46 np0005535656 NetworkManager[55548]: <info>  [1764097486.0821] device (tap30b651bd-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:04:46 np0005535656 systemd-machined[153481]: New machine qemu-8-instance-0000000c.
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.094 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[e61e02ce-66e9-4143-aac2-71a56cb8d12c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.100 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:46 np0005535656 ovn_controller[95460]: 2025-11-25T19:04:46Z|00091|binding|INFO|Setting lport 30b651bd-11f7-4be4-a855-ce8a0cb28154 ovn-installed in OVS
Nov 25 14:04:46 np0005535656 ovn_controller[95460]: 2025-11-25T19:04:46Z|00092|binding|INFO|Setting lport 30b651bd-11f7-4be4-a855-ce8a0cb28154 up in Southbound
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.105 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.105 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[d5e2777f-2c59-40ac-b9fe-a220f38ac05a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 systemd[1]: Started Virtual Machine qemu-8-instance-0000000c.
Nov 25 14:04:46 np0005535656 podman[212843]: 2025-11-25 19:04:46.108816118 +0000 UTC m=+0.118508812 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, 
version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal)
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.142 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[dbd4853e-7f26-4a4b-97a3-1cab516e38c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 systemd-udevd[212876]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.149 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1c16a6-d85a-4f94-815b-0496c6623cd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 NetworkManager[55548]: <info>  [1764097486.1505] manager: (tap8e881e87-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.187 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[9c092e4e-0b5e-4f36-83fe-9fb47c1111aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.189 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[422173a8-da8b-4044-8fd4-a35537585ff4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 NetworkManager[55548]: <info>  [1764097486.2119] device (tap8e881e87-b0): carrier: link connected
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.218 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[59eb9601-eec4-4f44-afb5-ac9d500b6d96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.234 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[77c2fb41-9d7e-41dc-8df5-b282fb8e0b38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444773, 'reachable_time': 44058, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212910, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.248 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[ed835d92-e6a7-4314-ae2f-436868643a34]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:6d5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444773, 'tstamp': 444773}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212911, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.264 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[59024961-9e13-4ac9-9e2d-f16e21f894fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444773, 'reachable_time': 44058, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212912, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.293 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6459041f-dcff-4a98-aef1-ffa9ad5b3dd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.374 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[70b54cd9-b689-4998-9aaa-3ddd34a48729]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.376 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.377 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.377 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e881e87-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:04:46 np0005535656 NetworkManager[55548]: <info>  [1764097486.4180] manager: (tap8e881e87-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Nov 25 14:04:46 np0005535656 kernel: tap8e881e87-b0: entered promiscuous mode
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.417 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.423 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e881e87-b0, col_values=(('external_ids', {'iface-id': 'f01fca37-0f9e-4574-bd34-7de06647d521'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:04:46 np0005535656 ovn_controller[95460]: 2025-11-25T19:04:46Z|00093|binding|INFO|Releasing lport f01fca37-0f9e-4574-bd34-7de06647d521 from this chassis (sb_readonly=0)
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.424 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.425 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.426 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.427 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e28075-9468-4c7e-9da5-90070d60f9b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.428 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 25 14:04:46 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:46.429 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'env', 'PROCESS_TAG=haproxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e881e87-b103-4ad8-8de5-f8f4f0a10891.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.438 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.455 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097486.4546196, 9503150b-9383-4483-8191-33e5f93b4550 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.456 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] VM Started (Lifecycle Event)
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.480 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.484 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097486.456408, 9503150b-9383-4483-8191-33e5f93b4550 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.485 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] VM Paused (Lifecycle Event)
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.510 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.515 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.550 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.617 187223 DEBUG nova.compute.manager [req-4b061a0b-a9f0-4491-8529-bcb5c39aaf1c req-abc5be0e-1243-408b-9692-a3dbc4a06e48 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.618 187223 DEBUG oslo_concurrency.lockutils [req-4b061a0b-a9f0-4491-8529-bcb5c39aaf1c req-abc5be0e-1243-408b-9692-a3dbc4a06e48 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.618 187223 DEBUG oslo_concurrency.lockutils [req-4b061a0b-a9f0-4491-8529-bcb5c39aaf1c req-abc5be0e-1243-408b-9692-a3dbc4a06e48 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.618 187223 DEBUG oslo_concurrency.lockutils [req-4b061a0b-a9f0-4491-8529-bcb5c39aaf1c req-abc5be0e-1243-408b-9692-a3dbc4a06e48 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.618 187223 DEBUG nova.compute.manager [req-4b061a0b-a9f0-4491-8529-bcb5c39aaf1c req-abc5be0e-1243-408b-9692-a3dbc4a06e48 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Processing event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.619 187223 DEBUG nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.624 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097486.624615, 9503150b-9383-4483-8191-33e5f93b4550 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.625 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] VM Resumed (Lifecycle Event)
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.628 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.633 187223 INFO nova.virt.libvirt.driver [-] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Instance spawned successfully.
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.634 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.686 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.696 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.701 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.702 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.702 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.703 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.704 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.705 187223 DEBUG nova.virt.libvirt.driver [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.741 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.796 187223 INFO nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Took 8.15 seconds to spawn the instance on the hypervisor.
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.796 187223 DEBUG nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 14:04:46 np0005535656 podman[212951]: 2025-11-25 19:04:46.805150302 +0000 UTC m=+0.060344131 container create c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:04:46 np0005535656 systemd[1]: Started libpod-conmon-c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e.scope.
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.865 187223 INFO nova.compute.manager [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Took 8.64 seconds to build instance.
Nov 25 14:04:46 np0005535656 podman[212951]: 2025-11-25 19:04:46.773205989 +0000 UTC m=+0.028419678 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.868 187223 DEBUG nova.network.neutron [req-a5c0aca7-e714-4bb5-8bde-e462eb8d2453 req-68bcf07a-17a6-4fd0-ac51-5af48ab6c89e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Updated VIF entry in instance network info cache for port 30b651bd-11f7-4be4-a855-ce8a0cb28154. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.868 187223 DEBUG nova.network.neutron [req-a5c0aca7-e714-4bb5-8bde-e462eb8d2453 req-68bcf07a-17a6-4fd0-ac51-5af48ab6c89e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Updating instance_info_cache with network_info: [{"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 14:04:46 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:04:46 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f675fbf63f628106dd781d0ce2c9582b61b8547fc689bee02acd29127ebf1fd1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.887 187223 DEBUG oslo_concurrency.lockutils [req-a5c0aca7-e714-4bb5-8bde-e462eb8d2453 req-68bcf07a-17a6-4fd0-ac51-5af48ab6c89e 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 25 14:04:46 np0005535656 nova_compute[187219]: 2025-11-25 19:04:46.890 187223 DEBUG oslo_concurrency.lockutils [None req-3ba6161d-386a-43a3-9d9d-4476e2ccd521 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:04:46 np0005535656 podman[212951]: 2025-11-25 19:04:46.893273591 +0000 UTC m=+0.148467400 container init c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:04:46 np0005535656 podman[212951]: 2025-11-25 19:04:46.899516185 +0000 UTC m=+0.154709984 container start c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 25 14:04:46 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[212967]: [NOTICE]   (212971) : New worker (212973) forked
Nov 25 14:04:46 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[212967]: [NOTICE]   (212971) : Loading success.
Nov 25 14:04:48 np0005535656 nova_compute[187219]: 2025-11-25 19:04:48.720 187223 DEBUG nova.compute.manager [req-3bfd52d2-ed9d-4b1d-91c0-c17db78e9a25 req-e2821e95-c26d-4358-a62e-d9886a814e87 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 25 14:04:48 np0005535656 nova_compute[187219]: 2025-11-25 19:04:48.720 187223 DEBUG oslo_concurrency.lockutils [req-3bfd52d2-ed9d-4b1d-91c0-c17db78e9a25 req-e2821e95-c26d-4358-a62e-d9886a814e87 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:04:48 np0005535656 nova_compute[187219]: 2025-11-25 19:04:48.720 187223 DEBUG oslo_concurrency.lockutils [req-3bfd52d2-ed9d-4b1d-91c0-c17db78e9a25 req-e2821e95-c26d-4358-a62e-d9886a814e87 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:04:48 np0005535656 nova_compute[187219]: 2025-11-25 19:04:48.720 187223 DEBUG oslo_concurrency.lockutils [req-3bfd52d2-ed9d-4b1d-91c0-c17db78e9a25 req-e2821e95-c26d-4358-a62e-d9886a814e87 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:04:48 np0005535656 nova_compute[187219]: 2025-11-25 19:04:48.721 187223 DEBUG nova.compute.manager [req-3bfd52d2-ed9d-4b1d-91c0-c17db78e9a25 req-e2821e95-c26d-4358-a62e-d9886a814e87 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] No waiting events found dispatching network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 25 14:04:48 np0005535656 nova_compute[187219]: 2025-11-25 19:04:48.721 187223 WARNING nova.compute.manager [req-3bfd52d2-ed9d-4b1d-91c0-c17db78e9a25 req-e2821e95-c26d-4358-a62e-d9886a814e87 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received unexpected event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 for instance with vm_state active and task_state None.
Nov 25 14:04:48 np0005535656 nova_compute[187219]: 2025-11-25 19:04:48.872 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:04:48 np0005535656 podman[212982]: 2025-11-25 19:04:48.937167205 +0000 UTC m=+0.051688732 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:04:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:04:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:04:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:04:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:04:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:04:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:04:50 np0005535656 nova_compute[187219]: 2025-11-25 19:04:50.781 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:04:51 np0005535656 nova_compute[187219]: 2025-11-25 19:04:51.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:04:53 np0005535656 nova_compute[187219]: 2025-11-25 19:04:53.874 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:04:55 np0005535656 nova_compute[187219]: 2025-11-25 19:04:55.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:04:55 np0005535656 nova_compute[187219]: 2025-11-25 19:04:55.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 25 14:04:55 np0005535656 nova_compute[187219]: 2025-11-25 19:04:55.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 25 14:04:55 np0005535656 nova_compute[187219]: 2025-11-25 19:04:55.783 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:04:56 np0005535656 nova_compute[187219]: 2025-11-25 19:04:56.442 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 25 14:04:56 np0005535656 nova_compute[187219]: 2025-11-25 19:04:56.442 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 25 14:04:56 np0005535656 nova_compute[187219]: 2025-11-25 19:04:56.443 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 25 14:04:56 np0005535656 nova_compute[187219]: 2025-11-25 19:04:56.443 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9503150b-9383-4483-8191-33e5f93b4550 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.690 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Updating instance_info_cache with network_info: [{"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.718 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.719 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.720 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.720 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.720 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.733 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.733 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.734 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 25 14:04:58 np0005535656 nova_compute[187219]: 2025-11-25 19:04:58.876 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:04:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:59.080 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:04:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:59.081 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:04:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:04:59.082 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:04:59 np0005535656 nova_compute[187219]: 2025-11-25 19:04:59.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:04:59 np0005535656 nova_compute[187219]: 2025-11-25 19:04:59.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:04:59 np0005535656 ovn_controller[95460]: 2025-11-25T19:04:59Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ae:82:e6 10.100.0.10
Nov 25 14:04:59 np0005535656 ovn_controller[95460]: 2025-11-25T19:04:59Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ae:82:e6 10.100.0.10
Nov 25 14:05:00 np0005535656 nova_compute[187219]: 2025-11-25 19:05:00.690 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:05:00 np0005535656 nova_compute[187219]: 2025-11-25 19:05:00.690 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:05:00 np0005535656 nova_compute[187219]: 2025-11-25 19:05:00.783 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:01 np0005535656 nova_compute[187219]: 2025-11-25 19:05:01.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:05:01 np0005535656 nova_compute[187219]: 2025-11-25 19:05:01.674 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:05:01 np0005535656 podman[213025]: 2025-11-25 19:05:01.991761836 +0000 UTC m=+0.089499936 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:05:03 np0005535656 nova_compute[187219]: 2025-11-25 19:05:03.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:05:03 np0005535656 nova_compute[187219]: 2025-11-25 19:05:03.877 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:05 np0005535656 podman[197580]: time="2025-11-25T19:05:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:05:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:05:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:05:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:05:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3058 "" "Go-http-client/1.1"
Nov 25 14:05:05 np0005535656 nova_compute[187219]: 2025-11-25 19:05:05.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:05:05 np0005535656 nova_compute[187219]: 2025-11-25 19:05:05.786 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.388 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.389 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.389 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.390 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.479 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.572 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.574 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.644 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.852 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.853 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.13483428955078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.854 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:06 np0005535656 nova_compute[187219]: 2025-11-25 19:05:06.854 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:07 np0005535656 nova_compute[187219]: 2025-11-25 19:05:07.012 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 9503150b-9383-4483-8191-33e5f93b4550 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:05:07 np0005535656 nova_compute[187219]: 2025-11-25 19:05:07.013 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:05:07 np0005535656 nova_compute[187219]: 2025-11-25 19:05:07.014 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:05:07 np0005535656 nova_compute[187219]: 2025-11-25 19:05:07.161 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:05:07 np0005535656 nova_compute[187219]: 2025-11-25 19:05:07.180 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:05:07 np0005535656 nova_compute[187219]: 2025-11-25 19:05:07.209 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:05:07 np0005535656 nova_compute[187219]: 2025-11-25 19:05:07.209 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:08 np0005535656 nova_compute[187219]: 2025-11-25 19:05:08.879 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:09 np0005535656 podman[213058]: 2025-11-25 19:05:09.96673707 +0000 UTC m=+0.075185192 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:05:09 np0005535656 podman[213057]: 2025-11-25 19:05:09.980187351 +0000 UTC m=+0.098960730 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 14:05:10 np0005535656 nova_compute[187219]: 2025-11-25 19:05:10.789 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:11 np0005535656 nova_compute[187219]: 2025-11-25 19:05:11.209 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:05:13 np0005535656 nova_compute[187219]: 2025-11-25 19:05:13.792 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:05:13 np0005535656 nova_compute[187219]: 2025-11-25 19:05:13.824 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Triggering sync for uuid 9503150b-9383-4483-8191-33e5f93b4550 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 25 14:05:13 np0005535656 nova_compute[187219]: 2025-11-25 19:05:13.825 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:13 np0005535656 nova_compute[187219]: 2025-11-25 19:05:13.825 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "9503150b-9383-4483-8191-33e5f93b4550" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:13 np0005535656 nova_compute[187219]: 2025-11-25 19:05:13.884 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:13 np0005535656 nova_compute[187219]: 2025-11-25 19:05:13.912 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "9503150b-9383-4483-8191-33e5f93b4550" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:14 np0005535656 nova_compute[187219]: 2025-11-25 19:05:14.202 187223 DEBUG nova.compute.manager [None req-23ee4a19-7943-403d-86f9-4d9d7ae5cb55 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610#033[00m
Nov 25 14:05:14 np0005535656 nova_compute[187219]: 2025-11-25 19:05:14.253 187223 DEBUG nova.compute.provider_tree [None req-23ee4a19-7943-403d-86f9-4d9d7ae5cb55 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 18 to 20 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 14:05:15 np0005535656 nova_compute[187219]: 2025-11-25 19:05:15.791 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:16 np0005535656 ovn_controller[95460]: 2025-11-25T19:05:16Z|00094|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Nov 25 14:05:16 np0005535656 podman[213102]: 2025-11-25 19:05:16.962347987 +0000 UTC m=+0.084717218 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 14:05:18 np0005535656 nova_compute[187219]: 2025-11-25 19:05:18.886 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:05:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:05:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:05:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:05:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:05:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:05:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:05:19 np0005535656 podman[213124]: 2025-11-25 19:05:19.961478394 +0000 UTC m=+0.082322183 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:05:20 np0005535656 nova_compute[187219]: 2025-11-25 19:05:20.471 187223 DEBUG nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Check if temp file /var/lib/nova/instances/tmp_0y0oe0i exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 25 14:05:20 np0005535656 nova_compute[187219]: 2025-11-25 19:05:20.471 187223 DEBUG nova.compute.manager [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_0y0oe0i',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9503150b-9383-4483-8191-33e5f93b4550',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 25 14:05:20 np0005535656 nova_compute[187219]: 2025-11-25 19:05:20.793 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:22 np0005535656 nova_compute[187219]: 2025-11-25 19:05:22.914 187223 DEBUG oslo_concurrency.processutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:05:22 np0005535656 nova_compute[187219]: 2025-11-25 19:05:22.996 187223 DEBUG oslo_concurrency.processutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:05:22 np0005535656 nova_compute[187219]: 2025-11-25 19:05:22.998 187223 DEBUG oslo_concurrency.processutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:05:23 np0005535656 nova_compute[187219]: 2025-11-25 19:05:23.049 187223 DEBUG oslo_concurrency.processutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:05:23 np0005535656 nova_compute[187219]: 2025-11-25 19:05:23.888 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:25 np0005535656 nova_compute[187219]: 2025-11-25 19:05:25.821 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:26 np0005535656 systemd-logind[788]: New session 32 of user nova.
Nov 25 14:05:26 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 14:05:26 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 14:05:26 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 14:05:26 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 14:05:26 np0005535656 systemd[213154]: Queued start job for default target Main User Target.
Nov 25 14:05:26 np0005535656 systemd[213154]: Created slice User Application Slice.
Nov 25 14:05:26 np0005535656 systemd[213154]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:05:26 np0005535656 systemd[213154]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 14:05:26 np0005535656 systemd[213154]: Reached target Paths.
Nov 25 14:05:26 np0005535656 systemd[213154]: Reached target Timers.
Nov 25 14:05:26 np0005535656 systemd[213154]: Starting D-Bus User Message Bus Socket...
Nov 25 14:05:26 np0005535656 systemd[213154]: Starting Create User's Volatile Files and Directories...
Nov 25 14:05:26 np0005535656 systemd[213154]: Finished Create User's Volatile Files and Directories.
Nov 25 14:05:26 np0005535656 systemd[213154]: Listening on D-Bus User Message Bus Socket.
Nov 25 14:05:26 np0005535656 systemd[213154]: Reached target Sockets.
Nov 25 14:05:26 np0005535656 systemd[213154]: Reached target Basic System.
Nov 25 14:05:26 np0005535656 systemd[213154]: Reached target Main User Target.
Nov 25 14:05:26 np0005535656 systemd[213154]: Startup finished in 171ms.
Nov 25 14:05:26 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 14:05:26 np0005535656 systemd[1]: Started Session 32 of User nova.
Nov 25 14:05:26 np0005535656 systemd[1]: session-32.scope: Deactivated successfully.
Nov 25 14:05:26 np0005535656 systemd-logind[788]: Session 32 logged out. Waiting for processes to exit.
Nov 25 14:05:26 np0005535656 systemd-logind[788]: Removed session 32.
Nov 25 14:05:27 np0005535656 nova_compute[187219]: 2025-11-25 19:05:27.189 187223 DEBUG nova.compute.manager [req-c5484153-ad68-4c0c-9970-94e8337de09f req-fa9a126d-70e0-420d-81b9-5145771bfbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-unplugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:05:27 np0005535656 nova_compute[187219]: 2025-11-25 19:05:27.191 187223 DEBUG oslo_concurrency.lockutils [req-c5484153-ad68-4c0c-9970-94e8337de09f req-fa9a126d-70e0-420d-81b9-5145771bfbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:27 np0005535656 nova_compute[187219]: 2025-11-25 19:05:27.191 187223 DEBUG oslo_concurrency.lockutils [req-c5484153-ad68-4c0c-9970-94e8337de09f req-fa9a126d-70e0-420d-81b9-5145771bfbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:27 np0005535656 nova_compute[187219]: 2025-11-25 19:05:27.192 187223 DEBUG oslo_concurrency.lockutils [req-c5484153-ad68-4c0c-9970-94e8337de09f req-fa9a126d-70e0-420d-81b9-5145771bfbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:27 np0005535656 nova_compute[187219]: 2025-11-25 19:05:27.192 187223 DEBUG nova.compute.manager [req-c5484153-ad68-4c0c-9970-94e8337de09f req-fa9a126d-70e0-420d-81b9-5145771bfbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] No waiting events found dispatching network-vif-unplugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:05:27 np0005535656 nova_compute[187219]: 2025-11-25 19:05:27.192 187223 DEBUG nova.compute.manager [req-c5484153-ad68-4c0c-9970-94e8337de09f req-fa9a126d-70e0-420d-81b9-5145771bfbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-unplugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:05:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:27.793 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:05:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:27.794 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:05:27 np0005535656 nova_compute[187219]: 2025-11-25 19:05:27.795 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.254 187223 INFO nova.compute.manager [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Took 5.20 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.255 187223 DEBUG nova.compute.manager [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.276 187223 DEBUG nova.compute.manager [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_0y0oe0i',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9503150b-9383-4483-8191-33e5f93b4550',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(69ac34b7-a12f-4fc2-aea5-df4e7c575170),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.306 187223 DEBUG nova.objects.instance [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid 9503150b-9383-4483-8191-33e5f93b4550 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.307 187223 DEBUG nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.309 187223 DEBUG nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.309 187223 DEBUG nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.330 187223 DEBUG nova.virt.libvirt.vif [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:04:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1329786635',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1329786635',id=12,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:04:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-zqiuwdcb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:04:46Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=9503150b-9383-4483-8191-33e5f93b4550,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.330 187223 DEBUG nova.network.os_vif_util [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.331 187223 DEBUG nova.network.os_vif_util [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ae:82:e6,bridge_name='br-int',has_traffic_filtering=True,id=30b651bd-11f7-4be4-a855-ce8a0cb28154,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b651bd-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.332 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 14:05:28 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:ae:82:e6"/>
Nov 25 14:05:28 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 14:05:28 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:05:28 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 14:05:28 np0005535656 nova_compute[187219]:  <target dev="tap30b651bd-11"/>
Nov 25 14:05:28 np0005535656 nova_compute[187219]: </interface>
Nov 25 14:05:28 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.332 187223 DEBUG nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 14:05:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:28.795 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.812 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.813 187223 INFO nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.870 187223 INFO nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 14:05:28 np0005535656 nova_compute[187219]: 2025-11-25 19:05:28.891 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.333 187223 DEBUG nova.compute.manager [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.333 187223 DEBUG oslo_concurrency.lockutils [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.334 187223 DEBUG oslo_concurrency.lockutils [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.334 187223 DEBUG oslo_concurrency.lockutils [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.335 187223 DEBUG nova.compute.manager [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] No waiting events found dispatching network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.335 187223 WARNING nova.compute.manager [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received unexpected event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.336 187223 DEBUG nova.compute.manager [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-changed-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.336 187223 DEBUG nova.compute.manager [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Refreshing instance network info cache due to event network-changed-30b651bd-11f7-4be4-a855-ce8a0cb28154. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.337 187223 DEBUG oslo_concurrency.lockutils [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.337 187223 DEBUG oslo_concurrency.lockutils [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.338 187223 DEBUG nova.network.neutron [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Refreshing network info cache for port 30b651bd-11f7-4be4-a855-ce8a0cb28154 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.374 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.374 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.877 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:05:29 np0005535656 nova_compute[187219]: 2025-11-25 19:05:29.878 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:05:30 np0005535656 nova_compute[187219]: 2025-11-25 19:05:30.381 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:05:30 np0005535656 nova_compute[187219]: 2025-11-25 19:05:30.382 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:05:30 np0005535656 nova_compute[187219]: 2025-11-25 19:05:30.826 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:30 np0005535656 nova_compute[187219]: 2025-11-25 19:05:30.885 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:05:30 np0005535656 nova_compute[187219]: 2025-11-25 19:05:30.886 187223 DEBUG nova.virt.libvirt.migration [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:05:30 np0005535656 nova_compute[187219]: 2025-11-25 19:05:30.982 187223 DEBUG nova.network.neutron [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Updated VIF entry in instance network info cache for port 30b651bd-11f7-4be4-a855-ce8a0cb28154. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:05:30 np0005535656 nova_compute[187219]: 2025-11-25 19:05:30.983 187223 DEBUG nova.network.neutron [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Updating instance_info_cache with network_info: [{"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.027 187223 DEBUG oslo_concurrency.lockutils [req-427b31ce-7b12-4a00-aa11-e0b93d054f77 req-9fd8c46a-c654-46bd-8ee2-dd3077861176 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-9503150b-9383-4483-8191-33e5f93b4550" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.067 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097531.0665438, 9503150b-9383-4483-8191-33e5f93b4550 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.067 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.096 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.102 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.132 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 14:05:31 np0005535656 kernel: tap30b651bd-11 (unregistering): left promiscuous mode
Nov 25 14:05:31 np0005535656 NetworkManager[55548]: <info>  [1764097531.2698] device (tap30b651bd-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:05:31 np0005535656 ovn_controller[95460]: 2025-11-25T19:05:31Z|00095|binding|INFO|Releasing lport 30b651bd-11f7-4be4-a855-ce8a0cb28154 from this chassis (sb_readonly=0)
Nov 25 14:05:31 np0005535656 ovn_controller[95460]: 2025-11-25T19:05:31Z|00096|binding|INFO|Setting lport 30b651bd-11f7-4be4-a855-ce8a0cb28154 down in Southbound
Nov 25 14:05:31 np0005535656 ovn_controller[95460]: 2025-11-25T19:05:31Z|00097|binding|INFO|Removing iface tap30b651bd-11 ovn-installed in OVS
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.321 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.332 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:82:e6 10.100.0.10'], port_security=['fa:16:3e:ae:82:e6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9503150b-9383-4483-8191-33e5f93b4550', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=30b651bd-11f7-4be4-a855-ce8a0cb28154) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.331 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.335 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 30b651bd-11f7-4be4-a855-ce8a0cb28154 in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.337 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.340 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[7a71972e-e07d-4470-a7b7-aa731df353f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.341 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace which is not needed anymore#033[00m
Nov 25 14:05:31 np0005535656 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 25 14:05:31 np0005535656 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Consumed 14.834s CPU time.
Nov 25 14:05:31 np0005535656 systemd-machined[153481]: Machine qemu-8-instance-0000000c terminated.
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.518 187223 DEBUG nova.virt.libvirt.guest [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.520 187223 INFO nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Migration operation has completed#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.520 187223 INFO nova.compute.manager [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] _post_live_migration() is started..#033[00m
Nov 25 14:05:31 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[212967]: [NOTICE]   (212971) : haproxy version is 2.8.14-c23fe91
Nov 25 14:05:31 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[212967]: [NOTICE]   (212971) : path to executable is /usr/sbin/haproxy
Nov 25 14:05:31 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[212967]: [WARNING]  (212971) : Exiting Master process...
Nov 25 14:05:31 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[212967]: [WARNING]  (212971) : Exiting Master process...
Nov 25 14:05:31 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[212967]: [ALERT]    (212971) : Current worker (212973) exited with code 143 (Terminated)
Nov 25 14:05:31 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[212967]: [WARNING]  (212971) : All workers exited. Exiting... (0)
Nov 25 14:05:31 np0005535656 systemd[1]: libpod-c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e.scope: Deactivated successfully.
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.528 187223 DEBUG nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 14:05:31 np0005535656 conmon[212967]: conmon c50361eabc17183c6f98 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e.scope/container/memory.events
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.528 187223 DEBUG nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.528 187223 DEBUG nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 14:05:31 np0005535656 podman[213204]: 2025-11-25 19:05:31.532165993 +0000 UTC m=+0.056645053 container died c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 14:05:31 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e-userdata-shm.mount: Deactivated successfully.
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.569 187223 DEBUG nova.compute.manager [req-6eff76f4-daef-4aa6-9b9a-eb64f9f50637 req-c364477b-112b-4569-9840-781374afbeb1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-unplugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.570 187223 DEBUG oslo_concurrency.lockutils [req-6eff76f4-daef-4aa6-9b9a-eb64f9f50637 req-c364477b-112b-4569-9840-781374afbeb1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.571 187223 DEBUG oslo_concurrency.lockutils [req-6eff76f4-daef-4aa6-9b9a-eb64f9f50637 req-c364477b-112b-4569-9840-781374afbeb1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.571 187223 DEBUG oslo_concurrency.lockutils [req-6eff76f4-daef-4aa6-9b9a-eb64f9f50637 req-c364477b-112b-4569-9840-781374afbeb1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.572 187223 DEBUG nova.compute.manager [req-6eff76f4-daef-4aa6-9b9a-eb64f9f50637 req-c364477b-112b-4569-9840-781374afbeb1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] No waiting events found dispatching network-vif-unplugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:05:31 np0005535656 systemd[1]: var-lib-containers-storage-overlay-f675fbf63f628106dd781d0ce2c9582b61b8547fc689bee02acd29127ebf1fd1-merged.mount: Deactivated successfully.
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.572 187223 DEBUG nova.compute.manager [req-6eff76f4-daef-4aa6-9b9a-eb64f9f50637 req-c364477b-112b-4569-9840-781374afbeb1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-unplugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:05:31 np0005535656 podman[213204]: 2025-11-25 19:05:31.588925516 +0000 UTC m=+0.113404606 container cleanup c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:05:31 np0005535656 systemd[1]: libpod-conmon-c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e.scope: Deactivated successfully.
Nov 25 14:05:31 np0005535656 podman[213246]: 2025-11-25 19:05:31.692718983 +0000 UTC m=+0.072842826 container remove c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.699 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[043caed7-63fd-4613-bfec-896cccf51e34]: (4, ('Tue Nov 25 07:05:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e)\nc50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e\nTue Nov 25 07:05:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (c50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e)\nc50361eabc17183c6f98550b5368f9a4252bc20cf0f2a4b0b4027e5be7eb443e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.702 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[98a14f45-928b-41cb-964d-9c90b28f1c3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.704 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.707 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:31 np0005535656 kernel: tap8e881e87-b0: left promiscuous mode
Nov 25 14:05:31 np0005535656 nova_compute[187219]: 2025-11-25 19:05:31.736 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.741 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[132d82a4-29d1-4d15-815e-4668abc08c5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.759 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[9d0ff6ee-b766-4018-8988-82edbe57d117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.762 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2c71a4a8-ed2f-4a9c-87f8-51ae8ac4e1c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.787 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5c878bd1-1502-4a93-993f-e0c6b87c0661]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444766, 'reachable_time': 30134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213265, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:05:31 np0005535656 systemd[1]: run-netns-ovnmeta\x2d8e881e87\x2db103\x2d4ad8\x2d8de5\x2df8f4f0a10891.mount: Deactivated successfully.
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.794 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:05:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:31.794 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[05cadcc6-29e6-40c1-af3b-0101e8ee1a18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.323 187223 DEBUG nova.network.neutron [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Activated binding for port 30b651bd-11f7-4be4-a855-ce8a0cb28154 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.324 187223 DEBUG nova.compute.manager [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.325 187223 DEBUG nova.virt.libvirt.vif [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:04:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1329786635',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1329786635',id=12,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:04:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-zqiuwdcb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:05:17Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=9503150b-9383-4483-8191-33e5f93b4550,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.325 187223 DEBUG nova.network.os_vif_util [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "address": "fa:16:3e:ae:82:e6", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b651bd-11", "ovs_interfaceid": "30b651bd-11f7-4be4-a855-ce8a0cb28154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.326 187223 DEBUG nova.network.os_vif_util [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ae:82:e6,bridge_name='br-int',has_traffic_filtering=True,id=30b651bd-11f7-4be4-a855-ce8a0cb28154,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b651bd-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.327 187223 DEBUG os_vif [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:82:e6,bridge_name='br-int',has_traffic_filtering=True,id=30b651bd-11f7-4be4-a855-ce8a0cb28154,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b651bd-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.329 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.330 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30b651bd-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.332 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.334 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.337 187223 INFO os_vif [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:82:e6,bridge_name='br-int',has_traffic_filtering=True,id=30b651bd-11f7-4be4-a855-ce8a0cb28154,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b651bd-11')#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.338 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.339 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.339 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.339 187223 DEBUG nova.compute.manager [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.340 187223 INFO nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Deleting instance files /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550_del#033[00m
Nov 25 14:05:32 np0005535656 nova_compute[187219]: 2025-11-25 19:05:32.341 187223 INFO nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Deletion of /var/lib/nova/instances/9503150b-9383-4483-8191-33e5f93b4550_del complete#033[00m
Nov 25 14:05:32 np0005535656 podman[213266]: 2025-11-25 19:05:32.971544027 +0000 UTC m=+0.083799601 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.694 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.694 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.694 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.695 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.695 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] No waiting events found dispatching network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.695 187223 WARNING nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received unexpected event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.696 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.696 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.697 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.697 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.698 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] No waiting events found dispatching network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.698 187223 WARNING nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received unexpected event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.698 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-unplugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.699 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.699 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.699 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.699 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] No waiting events found dispatching network-vif-unplugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.700 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-unplugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.700 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.701 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.701 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.701 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.702 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] No waiting events found dispatching network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.702 187223 WARNING nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received unexpected event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.703 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.703 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.704 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.704 187223 DEBUG oslo_concurrency.lockutils [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.704 187223 DEBUG nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] No waiting events found dispatching network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:05:33 np0005535656 nova_compute[187219]: 2025-11-25 19:05:33.705 187223 WARNING nova.compute.manager [req-98663d12-9fef-49c7-b269-6c788ab7bce6 req-eedeb890-4e1b-43af-89e8-f8bf1a9426dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Received unexpected event network-vif-plugged-30b651bd-11f7-4be4-a855-ce8a0cb28154 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:05:35 np0005535656 podman[197580]: time="2025-11-25T19:05:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:05:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:05:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:05:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:05:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 25 14:05:35 np0005535656 nova_compute[187219]: 2025-11-25 19:05:35.826 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:05:36 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 14:05:36 np0005535656 systemd[213154]: Activating special unit Exit the Session...
Nov 25 14:05:36 np0005535656 systemd[213154]: Stopped target Main User Target.
Nov 25 14:05:36 np0005535656 systemd[213154]: Stopped target Basic System.
Nov 25 14:05:36 np0005535656 systemd[213154]: Stopped target Paths.
Nov 25 14:05:36 np0005535656 systemd[213154]: Stopped target Sockets.
Nov 25 14:05:36 np0005535656 systemd[213154]: Stopped target Timers.
Nov 25 14:05:36 np0005535656 systemd[213154]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:05:36 np0005535656 systemd[213154]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 14:05:36 np0005535656 systemd[213154]: Closed D-Bus User Message Bus Socket.
Nov 25 14:05:36 np0005535656 systemd[213154]: Stopped Create User's Volatile Files and Directories.
Nov 25 14:05:36 np0005535656 systemd[213154]: Removed slice User Application Slice.
Nov 25 14:05:36 np0005535656 systemd[213154]: Reached target Shutdown.
Nov 25 14:05:36 np0005535656 systemd[213154]: Finished Exit the Session.
Nov 25 14:05:36 np0005535656 systemd[213154]: Reached target Exit the Session.
Nov 25 14:05:36 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 14:05:36 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 14:05:36 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 14:05:36 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 14:05:36 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 14:05:36 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 14:05:36 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 14:05:36 np0005535656 nova_compute[187219]: 2025-11-25 19:05:36.814 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "9503150b-9383-4483-8191-33e5f93b4550-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:36 np0005535656 nova_compute[187219]: 2025-11-25 19:05:36.815 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:36 np0005535656 nova_compute[187219]: 2025-11-25 19:05:36.816 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "9503150b-9383-4483-8191-33e5f93b4550-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:36 np0005535656 nova_compute[187219]: 2025-11-25 19:05:36.839 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:05:36 np0005535656 nova_compute[187219]: 2025-11-25 19:05:36.840 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:05:36 np0005535656 nova_compute[187219]: 2025-11-25 19:05:36.841 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:05:36 np0005535656 nova_compute[187219]: 2025-11-25 19:05:36.841 187223 DEBUG nova.compute.resource_tracker [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.067 187223 WARNING nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.068 187223 DEBUG nova.compute.resource_tracker [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5893MB free_disk=73.16383743286133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.068 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.069 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.121 187223 DEBUG nova.compute.resource_tracker [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration for instance 9503150b-9383-4483-8191-33e5f93b4550 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.144 187223 DEBUG nova.compute.resource_tracker [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.195 187223 DEBUG nova.compute.resource_tracker [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration 69ac34b7-a12f-4fc2-aea5-df4e7c575170 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.195 187223 DEBUG nova.compute.resource_tracker [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.196 187223 DEBUG nova.compute.resource_tracker [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.298 187223 DEBUG nova.compute.provider_tree [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.330 187223 DEBUG nova.scheduler.client.report [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.335 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.373 187223 DEBUG nova.compute.resource_tracker [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.373 187223 DEBUG oslo_concurrency.lockutils [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.382 187223 INFO nova.compute.manager [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.564 187223 INFO nova.scheduler.client.report [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Deleted allocation for migration 69ac34b7-a12f-4fc2-aea5-df4e7c575170
Nov 25 14:05:37 np0005535656 nova_compute[187219]: 2025-11-25 19:05:37.565 187223 DEBUG nova.virt.libvirt.driver [None req-b4ac5ae2-d8ce-476b-a485-285aa4bd85dc fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 25 14:05:40 np0005535656 nova_compute[187219]: 2025-11-25 19:05:40.866 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:05:40 np0005535656 podman[213293]: 2025-11-25 19:05:40.960998086 +0000 UTC m=+0.065090998 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 14:05:41 np0005535656 podman[213292]: 2025-11-25 19:05:41.030173643 +0000 UTC m=+0.127170534 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 14:05:42 np0005535656 nova_compute[187219]: 2025-11-25 19:05:42.337 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:05:45 np0005535656 nova_compute[187219]: 2025-11-25 19:05:45.868 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:05:46 np0005535656 nova_compute[187219]: 2025-11-25 19:05:46.528 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764097531.5172975, 9503150b-9383-4483-8191-33e5f93b4550 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 14:05:46 np0005535656 nova_compute[187219]: 2025-11-25 19:05:46.529 187223 INFO nova.compute.manager [-] [instance: 9503150b-9383-4483-8191-33e5f93b4550] VM Stopped (Lifecycle Event)
Nov 25 14:05:46 np0005535656 nova_compute[187219]: 2025-11-25 19:05:46.560 187223 DEBUG nova.compute.manager [None req-c978ed2b-25df-4ff7-bc39-1f6232ea08ad - - - - - -] [instance: 9503150b-9383-4483-8191-33e5f93b4550] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 14:05:47 np0005535656 nova_compute[187219]: 2025-11-25 19:05:47.340 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:05:47 np0005535656 podman[213337]: 2025-11-25 19:05:47.957777934 +0000 UTC m=+0.066512856 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, release=1755695350)
Nov 25 14:05:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:05:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:05:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:05:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:05:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:05:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:05:50 np0005535656 nova_compute[187219]: 2025-11-25 19:05:50.917 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:05:50 np0005535656 podman[213359]: 2025-11-25 19:05:50.977638011 +0000 UTC m=+0.093352317 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:05:52 np0005535656 nova_compute[187219]: 2025-11-25 19:05:52.342 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:05:53 np0005535656 nova_compute[187219]: 2025-11-25 19:05:53.706 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:05:55 np0005535656 nova_compute[187219]: 2025-11-25 19:05:55.921 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:05:56 np0005535656 nova_compute[187219]: 2025-11-25 19:05:56.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:05:56 np0005535656 nova_compute[187219]: 2025-11-25 19:05:56.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 25 14:05:56 np0005535656 nova_compute[187219]: 2025-11-25 19:05:56.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 25 14:05:56 np0005535656 nova_compute[187219]: 2025-11-25 19:05:56.703 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 25 14:05:57 np0005535656 nova_compute[187219]: 2025-11-25 19:05:57.345 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:05:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:59.081 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:05:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:59.082 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:05:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:05:59.082 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:05:59 np0005535656 nova_compute[187219]: 2025-11-25 19:05:59.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:05:59 np0005535656 nova_compute[187219]: 2025-11-25 19:05:59.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:06:00 np0005535656 nova_compute[187219]: 2025-11-25 19:06:00.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:06:00 np0005535656 nova_compute[187219]: 2025-11-25 19:06:00.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 25 14:06:00 np0005535656 nova_compute[187219]: 2025-11-25 19:06:00.925 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:06:02 np0005535656 nova_compute[187219]: 2025-11-25 19:06:02.347 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:06:03 np0005535656 nova_compute[187219]: 2025-11-25 19:06:03.447 187223 DEBUG nova.compute.manager [None req-707a9702-1d4d-43c4-8933-90eb0b37c79d 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Nov 25 14:06:03 np0005535656 nova_compute[187219]: 2025-11-25 19:06:03.514 187223 DEBUG nova.compute.provider_tree [None req-707a9702-1d4d-43c4-8933-90eb0b37c79d 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 20 to 23 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 25 14:06:03 np0005535656 nova_compute[187219]: 2025-11-25 19:06:03.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:06:03 np0005535656 nova_compute[187219]: 2025-11-25 19:06:03.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:06:03 np0005535656 podman[213380]: 2025-11-25 19:06:03.964285014 +0000 UTC m=+0.073049723 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:06:05 np0005535656 podman[197580]: time="2025-11-25T19:06:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:06:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:06:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:06:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:06:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.701 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.701 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.701 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.702 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.927 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.964 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.966 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5894MB free_disk=73.16385650634766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.967 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:06:05 np0005535656 nova_compute[187219]: 2025-11-25 19:06:05.967 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:06:06 np0005535656 nova_compute[187219]: 2025-11-25 19:06:06.043 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 25 14:06:06 np0005535656 nova_compute[187219]: 2025-11-25 19:06:06.044 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 25 14:06:06 np0005535656 nova_compute[187219]: 2025-11-25 19:06:06.075 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 14:06:06 np0005535656 nova_compute[187219]: 2025-11-25 19:06:06.099 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 25 14:06:06 np0005535656 nova_compute[187219]: 2025-11-25 19:06:06.102 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 25 14:06:06 np0005535656 nova_compute[187219]: 2025-11-25 19:06:06.103 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:06:07 np0005535656 nova_compute[187219]: 2025-11-25 19:06:07.349 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:06:10 np0005535656 nova_compute[187219]: 2025-11-25 19:06:10.105 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:06:10 np0005535656 nova_compute[187219]: 2025-11-25 19:06:10.929 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:06:11 np0005535656 podman[213405]: 2025-11-25 19:06:11.973739362 +0000 UTC m=+0.081179740 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:06:12 np0005535656 podman[213404]: 2025-11-25 19:06:12.001219479 +0000 UTC m=+0.115261215 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 25 14:06:12 np0005535656 nova_compute[187219]: 2025-11-25 19:06:12.352 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:06:15 np0005535656 ovn_controller[95460]: 2025-11-25T19:06:15Z|00098|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 25 14:06:15 np0005535656 nova_compute[187219]: 2025-11-25 19:06:15.934 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:06:17 np0005535656 nova_compute[187219]: 2025-11-25 19:06:17.354 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:06:17 np0005535656 nova_compute[187219]: 2025-11-25 19:06:17.730 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:06:17 np0005535656 nova_compute[187219]: 2025-11-25 19:06:17.730 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:06:17 np0005535656 nova_compute[187219]: 2025-11-25 19:06:17.749 187223 DEBUG nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 25 14:06:17 np0005535656 nova_compute[187219]: 2025-11-25 19:06:17.832 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:06:17 np0005535656 nova_compute[187219]: 2025-11-25 19:06:17.833 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:06:17 np0005535656 nova_compute[187219]: 2025-11-25 19:06:17.841 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 25 14:06:17 np0005535656 nova_compute[187219]: 2025-11-25 19:06:17.841 187223 INFO nova.compute.claims [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Claim successful on node compute-1.ctlplane.example.com
Nov 25 14:06:17 np0005535656 nova_compute[187219]: 2025-11-25 19:06:17.992 187223 DEBUG nova.compute.provider_tree [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.016 187223 DEBUG nova.scheduler.client.report [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.056 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.057 187223 DEBUG nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.125 187223 DEBUG nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.125 187223 DEBUG nova.network.neutron [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.145 187223 INFO nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.168 187223 DEBUG nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.292 187223 DEBUG nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.294 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.295 187223 INFO nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Creating image(s)
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.296 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "/var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.297 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.298 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.329 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.419 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.421 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.422 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.447 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.529 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.531 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.586 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.588 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.590 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.677 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.679 187223 DEBUG nova.virt.disk.api [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Checking if we can resize image /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.680 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.736 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.738 187223 DEBUG nova.virt.disk.api [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Cannot resize image /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.739 187223 DEBUG nova.objects.instance [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'migration_context' on Instance uuid f3eec114-ab49-46a4-93df-5f391c88194f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 25 14:06:18 np0005535656 nova_compute[187219]: 2025-11-25 19:06:18.854 187223 DEBUG nova.policy [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e60aa8a36ef94fa186a5c8de1df9e594', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab3670f92d82410b981d159346c0c038', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 25 14:06:18 np0005535656 podman[213466]: 2025-11-25 19:06:18.951301234 +0000 UTC m=+0.066882226 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public)
Nov 25 14:06:19 np0005535656 nova_compute[187219]: 2025-11-25 19:06:19.121 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 25 14:06:19 np0005535656 nova_compute[187219]: 2025-11-25 19:06:19.122 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Ensure instance console log exists: /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 25 14:06:19 np0005535656 nova_compute[187219]: 2025-11-25 19:06:19.123 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:06:19 np0005535656 nova_compute[187219]: 2025-11-25 19:06:19.123 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:06:19 np0005535656 nova_compute[187219]: 2025-11-25 19:06:19.124 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:06:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:06:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:06:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:06:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:06:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:06:20 np0005535656 nova_compute[187219]: 2025-11-25 19:06:20.060 187223 DEBUG nova.network.neutron [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Successfully created port: d57a8988-8300-46ea-a464-f60cefb0a63c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:06:20 np0005535656 nova_compute[187219]: 2025-11-25 19:06:20.829 187223 DEBUG nova.network.neutron [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Successfully updated port: d57a8988-8300-46ea-a464-f60cefb0a63c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:06:20 np0005535656 nova_compute[187219]: 2025-11-25 19:06:20.854 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:06:20 np0005535656 nova_compute[187219]: 2025-11-25 19:06:20.855 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquired lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:06:20 np0005535656 nova_compute[187219]: 2025-11-25 19:06:20.855 187223 DEBUG nova.network.neutron [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:06:20 np0005535656 nova_compute[187219]: 2025-11-25 19:06:20.935 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:20 np0005535656 nova_compute[187219]: 2025-11-25 19:06:20.965 187223 DEBUG nova.compute.manager [req-5609e1ad-dd28-4b94-8534-b562fec3b0a7 req-85684f72-61a2-45dc-b9a0-1b72770ffbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-changed-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:06:20 np0005535656 nova_compute[187219]: 2025-11-25 19:06:20.965 187223 DEBUG nova.compute.manager [req-5609e1ad-dd28-4b94-8534-b562fec3b0a7 req-85684f72-61a2-45dc-b9a0-1b72770ffbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Refreshing instance network info cache due to event network-changed-d57a8988-8300-46ea-a464-f60cefb0a63c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:06:20 np0005535656 nova_compute[187219]: 2025-11-25 19:06:20.966 187223 DEBUG oslo_concurrency.lockutils [req-5609e1ad-dd28-4b94-8534-b562fec3b0a7 req-85684f72-61a2-45dc-b9a0-1b72770ffbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:06:21 np0005535656 nova_compute[187219]: 2025-11-25 19:06:21.045 187223 DEBUG nova.network.neutron [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:06:21 np0005535656 podman[213489]: 2025-11-25 19:06:21.972531338 +0000 UTC m=+0.085010614 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 14:06:22 np0005535656 nova_compute[187219]: 2025-11-25 19:06:22.162 187223 DEBUG nova.network.neutron [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Updating instance_info_cache with network_info: [{"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:06:22 np0005535656 nova_compute[187219]: 2025-11-25 19:06:22.356 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.004 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Releasing lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.004 187223 DEBUG nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Instance network_info: |[{"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.005 187223 DEBUG oslo_concurrency.lockutils [req-5609e1ad-dd28-4b94-8534-b562fec3b0a7 req-85684f72-61a2-45dc-b9a0-1b72770ffbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.005 187223 DEBUG nova.network.neutron [req-5609e1ad-dd28-4b94-8534-b562fec3b0a7 req-85684f72-61a2-45dc-b9a0-1b72770ffbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Refreshing network info cache for port d57a8988-8300-46ea-a464-f60cefb0a63c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.008 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Start _get_guest_xml network_info=[{"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.013 187223 WARNING nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.017 187223 DEBUG nova.virt.libvirt.host [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.018 187223 DEBUG nova.virt.libvirt.host [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.022 187223 DEBUG nova.virt.libvirt.host [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.023 187223 DEBUG nova.virt.libvirt.host [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.024 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.024 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.025 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.025 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.025 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.025 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.025 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.026 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.026 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.026 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.026 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.027 187223 DEBUG nova.virt.hardware [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.030 187223 DEBUG nova.virt.libvirt.vif [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:06:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-214460348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-214460348',id=13,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-d8oi41kt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:06:18Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=f3eec114-ab49-46a4-93df-5f391c88194f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.031 187223 DEBUG nova.network.os_vif_util [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.031 187223 DEBUG nova.network.os_vif_util [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:5b:ca,bridge_name='br-int',has_traffic_filtering=True,id=d57a8988-8300-46ea-a464-f60cefb0a63c,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd57a8988-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.032 187223 DEBUG nova.objects.instance [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'pci_devices' on Instance uuid f3eec114-ab49-46a4-93df-5f391c88194f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.056 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <uuid>f3eec114-ab49-46a4-93df-5f391c88194f</uuid>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <name>instance-0000000d</name>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteStrategies-server-214460348</nova:name>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:06:23</nova:creationTime>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:        <nova:user uuid="e60aa8a36ef94fa186a5c8de1df9e594">tempest-TestExecuteStrategies-2025590332-project-member</nova:user>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:        <nova:project uuid="ab3670f92d82410b981d159346c0c038">tempest-TestExecuteStrategies-2025590332</nova:project>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:        <nova:port uuid="d57a8988-8300-46ea-a464-f60cefb0a63c">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <entry name="serial">f3eec114-ab49-46a4-93df-5f391c88194f</entry>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <entry name="uuid">f3eec114-ab49-46a4-93df-5f391c88194f</entry>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk.config"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:05:5b:ca"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <target dev="tapd57a8988-83"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/console.log" append="off"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:06:23 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:06:23 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:06:23 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:06:23 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.058 187223 DEBUG nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Preparing to wait for external event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.058 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.058 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.058 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.059 187223 DEBUG nova.virt.libvirt.vif [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:06:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-214460348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-214460348',id=13,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-d8oi41kt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=T
agList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:06:18Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=f3eec114-ab49-46a4-93df-5f391c88194f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.059 187223 DEBUG nova.network.os_vif_util [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.059 187223 DEBUG nova.network.os_vif_util [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:5b:ca,bridge_name='br-int',has_traffic_filtering=True,id=d57a8988-8300-46ea-a464-f60cefb0a63c,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd57a8988-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.060 187223 DEBUG os_vif [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:5b:ca,bridge_name='br-int',has_traffic_filtering=True,id=d57a8988-8300-46ea-a464-f60cefb0a63c,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd57a8988-83') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.060 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.060 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.061 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.062 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.062 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd57a8988-83, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.063 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd57a8988-83, col_values=(('external_ids', {'iface-id': 'd57a8988-8300-46ea-a464-f60cefb0a63c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:5b:ca', 'vm-uuid': 'f3eec114-ab49-46a4-93df-5f391c88194f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.064 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:23 np0005535656 NetworkManager[55548]: <info>  [1764097583.0654] manager: (tapd57a8988-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.067 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.071 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.071 187223 INFO os_vif [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:5b:ca,bridge_name='br-int',has_traffic_filtering=True,id=d57a8988-8300-46ea-a464-f60cefb0a63c,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd57a8988-83')#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.133 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.133 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.133 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No VIF found with MAC fa:16:3e:05:5b:ca, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:06:23 np0005535656 nova_compute[187219]: 2025-11-25 19:06:23.134 187223 INFO nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Using config drive#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.028 187223 INFO nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Creating config drive at /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk.config#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.037 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu5gmleu8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.179 187223 DEBUG oslo_concurrency.processutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu5gmleu8" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:06:24 np0005535656 kernel: tapd57a8988-83: entered promiscuous mode
Nov 25 14:06:24 np0005535656 NetworkManager[55548]: <info>  [1764097584.2663] manager: (tapd57a8988-83): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Nov 25 14:06:24 np0005535656 ovn_controller[95460]: 2025-11-25T19:06:24Z|00099|binding|INFO|Claiming lport d57a8988-8300-46ea-a464-f60cefb0a63c for this chassis.
Nov 25 14:06:24 np0005535656 ovn_controller[95460]: 2025-11-25T19:06:24Z|00100|binding|INFO|d57a8988-8300-46ea-a464-f60cefb0a63c: Claiming fa:16:3e:05:5b:ca 10.100.0.12
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.268 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.279 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:5b:ca 10.100.0.12'], port_security=['fa:16:3e:05:5b:ca 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f3eec114-ab49-46a4-93df-5f391c88194f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=d57a8988-8300-46ea-a464-f60cefb0a63c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.281 104346 INFO neutron.agent.ovn.metadata.agent [-] Port d57a8988-8300-46ea-a464-f60cefb0a63c in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 bound to our chassis#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.284 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891#033[00m
Nov 25 14:06:24 np0005535656 ovn_controller[95460]: 2025-11-25T19:06:24Z|00101|binding|INFO|Setting lport d57a8988-8300-46ea-a464-f60cefb0a63c ovn-installed in OVS
Nov 25 14:06:24 np0005535656 ovn_controller[95460]: 2025-11-25T19:06:24Z|00102|binding|INFO|Setting lport d57a8988-8300-46ea-a464-f60cefb0a63c up in Southbound
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.298 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.302 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.303 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[9a7e3749-31d6-4310-8acc-069a56a1a45b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.305 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e881e87-b1 in ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.307 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e881e87-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.308 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e7fe41-7e36-4838-a2f6-8ba0b084f093]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.309 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6d2b6f-82dc-4143-809e-cf36be0ae22e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 systemd-udevd[213530]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.329 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[0e34321a-ed7b-4738-866f-66f8c4138267]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 systemd-machined[153481]: New machine qemu-9-instance-0000000d.
Nov 25 14:06:24 np0005535656 NetworkManager[55548]: <info>  [1764097584.3470] device (tapd57a8988-83): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:06:24 np0005535656 NetworkManager[55548]: <info>  [1764097584.3482] device (tapd57a8988-83): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:06:24 np0005535656 systemd[1]: Started Virtual Machine qemu-9-instance-0000000d.
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.360 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[7057e241-ed7e-42b8-9feb-52477a31dabd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.407 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[79b7e776-211a-43f8-b044-f0ada9684877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.416 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb65151-357c-45ba-a337-dcc1aee91f48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 NetworkManager[55548]: <info>  [1764097584.4180] manager: (tap8e881e87-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.465 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[923b6887-6752-4365-aeb1-01f5d87078da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.470 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[faaff61d-f501-4c6c-86cb-09866ff3ffdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 NetworkManager[55548]: <info>  [1764097584.5057] device (tap8e881e87-b0): carrier: link connected
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.514 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[b52be7e8-6211-4ad4-98c4-5255cfe07cb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.538 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e05a071b-ac31-430e-8047-2cd781607a1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454603, 'reachable_time': 35436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213562, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.564 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[efea81df-ffde-4e01-ad0f-df21683df9d3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:6d5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454603, 'tstamp': 454603}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213563, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.594 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2073c6-419d-4074-b04f-9c5f65a06781]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454603, 'reachable_time': 35436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213566, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.635 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[48ba0231-01bf-4df5-bb0f-e3f197f38d0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.672 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097584.6713095, f3eec114-ab49-46a4-93df-5f391c88194f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.672 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] VM Started (Lifecycle Event)#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.709 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.714 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb02267-83a6-4e16-8fc1-500c0a318550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.715 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097584.6716979, f3eec114-ab49-46a4-93df-5f391c88194f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.715 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.715 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.716 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.717 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e881e87-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:06:24 np0005535656 NetworkManager[55548]: <info>  [1764097584.7691] manager: (tap8e881e87-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.768 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:06:24 np0005535656 kernel: tap8e881e87-b0: entered promiscuous mode
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.770 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.774 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e881e87-b0, col_values=(('external_ids', {'iface-id': 'f01fca37-0f9e-4574-bd34-7de06647d521'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:06:24 np0005535656 ovn_controller[95460]: 2025-11-25T19:06:24Z|00103|binding|INFO|Releasing lport f01fca37-0f9e-4574-bd34-7de06647d521 from this chassis (sb_readonly=0)
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.777 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.779 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.780 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[64596a50-0cd3-4d24-ac58-aa67116e2a11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.781 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.782 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 14:06:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:24.783 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'env', 'PROCESS_TAG=haproxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e881e87-b103-4ad8-8de5-f8f4f0a10891.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.801 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:24 np0005535656 nova_compute[187219]: 2025-11-25 19:06:24.809 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.002 187223 DEBUG nova.compute.manager [req-ef74e238-4893-454e-806e-b41c6e2f6cde req-ad57e0e8-c020-4be1-af8e-9769f52a0178 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.003 187223 DEBUG oslo_concurrency.lockutils [req-ef74e238-4893-454e-806e-b41c6e2f6cde req-ad57e0e8-c020-4be1-af8e-9769f52a0178 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.004 187223 DEBUG oslo_concurrency.lockutils [req-ef74e238-4893-454e-806e-b41c6e2f6cde req-ad57e0e8-c020-4be1-af8e-9769f52a0178 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.005 187223 DEBUG oslo_concurrency.lockutils [req-ef74e238-4893-454e-806e-b41c6e2f6cde req-ad57e0e8-c020-4be1-af8e-9769f52a0178 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.005 187223 DEBUG nova.compute.manager [req-ef74e238-4893-454e-806e-b41c6e2f6cde req-ad57e0e8-c020-4be1-af8e-9769f52a0178 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Processing event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.007 187223 DEBUG nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.029 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097585.022643, f3eec114-ab49-46a4-93df-5f391c88194f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.031 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.035 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.050 187223 INFO nova.virt.libvirt.driver [-] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Instance spawned successfully.#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.051 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.065 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.068 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.075 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.075 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.076 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.076 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.077 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.077 187223 DEBUG nova.virt.libvirt.driver [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.094 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.163 187223 INFO nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Took 6.87 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.164 187223 DEBUG nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.225 187223 INFO nova.compute.manager [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Took 7.42 seconds to build instance.#033[00m
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.245 187223 DEBUG oslo_concurrency.lockutils [None req-a87c9e8e-214c-4799-8a01-66165082f608 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:06:25 np0005535656 podman[213602]: 2025-11-25 19:06:25.279674748 +0000 UTC m=+0.080412600 container create b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 25 14:06:25 np0005535656 podman[213602]: 2025-11-25 19:06:25.241718789 +0000 UTC m=+0.042456691 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:06:25 np0005535656 systemd[1]: Started libpod-conmon-b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71.scope.
Nov 25 14:06:25 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:06:25 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbc8bf84e6187c1b568c9ccb9e21ba7ae16eb67364ec0d18f5860823b2e87a2f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:06:25 np0005535656 podman[213602]: 2025-11-25 19:06:25.40079942 +0000 UTC m=+0.201537302 container init b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 25 14:06:25 np0005535656 podman[213602]: 2025-11-25 19:06:25.412304599 +0000 UTC m=+0.213042461 container start b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 25 14:06:25 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[213615]: [NOTICE]   (213619) : New worker (213621) forked
Nov 25 14:06:25 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[213615]: [NOTICE]   (213619) : Loading success.
Nov 25 14:06:25 np0005535656 nova_compute[187219]: 2025-11-25 19:06:25.939 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:26 np0005535656 nova_compute[187219]: 2025-11-25 19:06:26.037 187223 DEBUG nova.network.neutron [req-5609e1ad-dd28-4b94-8534-b562fec3b0a7 req-85684f72-61a2-45dc-b9a0-1b72770ffbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Updated VIF entry in instance network info cache for port d57a8988-8300-46ea-a464-f60cefb0a63c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:06:26 np0005535656 nova_compute[187219]: 2025-11-25 19:06:26.039 187223 DEBUG nova.network.neutron [req-5609e1ad-dd28-4b94-8534-b562fec3b0a7 req-85684f72-61a2-45dc-b9a0-1b72770ffbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Updating instance_info_cache with network_info: [{"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:06:26 np0005535656 nova_compute[187219]: 2025-11-25 19:06:26.081 187223 DEBUG oslo_concurrency.lockutils [req-5609e1ad-dd28-4b94-8534-b562fec3b0a7 req-85684f72-61a2-45dc-b9a0-1b72770ffbdf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:06:28 np0005535656 nova_compute[187219]: 2025-11-25 19:06:28.020 187223 DEBUG nova.compute.manager [req-477b8360-c20d-4da3-8977-d0eda197de4d req-590294dd-0aa3-447b-a95d-da2d893e925f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:06:28 np0005535656 nova_compute[187219]: 2025-11-25 19:06:28.020 187223 DEBUG oslo_concurrency.lockutils [req-477b8360-c20d-4da3-8977-d0eda197de4d req-590294dd-0aa3-447b-a95d-da2d893e925f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:06:28 np0005535656 nova_compute[187219]: 2025-11-25 19:06:28.021 187223 DEBUG oslo_concurrency.lockutils [req-477b8360-c20d-4da3-8977-d0eda197de4d req-590294dd-0aa3-447b-a95d-da2d893e925f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:06:28 np0005535656 nova_compute[187219]: 2025-11-25 19:06:28.021 187223 DEBUG oslo_concurrency.lockutils [req-477b8360-c20d-4da3-8977-d0eda197de4d req-590294dd-0aa3-447b-a95d-da2d893e925f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:06:28 np0005535656 nova_compute[187219]: 2025-11-25 19:06:28.022 187223 DEBUG nova.compute.manager [req-477b8360-c20d-4da3-8977-d0eda197de4d req-590294dd-0aa3-447b-a95d-da2d893e925f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] No waiting events found dispatching network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:06:28 np0005535656 nova_compute[187219]: 2025-11-25 19:06:28.022 187223 WARNING nova.compute.manager [req-477b8360-c20d-4da3-8977-d0eda197de4d req-590294dd-0aa3-447b-a95d-da2d893e925f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received unexpected event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c for instance with vm_state active and task_state None.#033[00m
Nov 25 14:06:28 np0005535656 nova_compute[187219]: 2025-11-25 19:06:28.067 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:30 np0005535656 nova_compute[187219]: 2025-11-25 19:06:30.941 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:33 np0005535656 nova_compute[187219]: 2025-11-25 19:06:33.071 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:34 np0005535656 podman[213630]: 2025-11-25 19:06:34.961480764 +0000 UTC m=+0.078462348 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:06:35 np0005535656 podman[197580]: time="2025-11-25T19:06:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:06:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:06:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 25 14:06:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:06:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3062 "" "Go-http-client/1.1"
Nov 25 14:06:35 np0005535656 nova_compute[187219]: 2025-11-25 19:06:35.943 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:36.555 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:06:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:36.556 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:06:36 np0005535656 nova_compute[187219]: 2025-11-25 19:06:36.587 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:37.559 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:06:38 np0005535656 nova_compute[187219]: 2025-11-25 19:06:38.074 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:38 np0005535656 ovn_controller[95460]: 2025-11-25T19:06:38Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:5b:ca 10.100.0.12
Nov 25 14:06:38 np0005535656 ovn_controller[95460]: 2025-11-25T19:06:38Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:5b:ca 10.100.0.12
Nov 25 14:06:40 np0005535656 nova_compute[187219]: 2025-11-25 19:06:40.945 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:42 np0005535656 podman[213677]: 2025-11-25 19:06:42.990958698 +0000 UTC m=+0.103334036 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 25 14:06:43 np0005535656 podman[213676]: 2025-11-25 19:06:43.007480821 +0000 UTC m=+0.122693815 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 25 14:06:43 np0005535656 nova_compute[187219]: 2025-11-25 19:06:43.076 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:45 np0005535656 nova_compute[187219]: 2025-11-25 19:06:45.948 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:48 np0005535656 nova_compute[187219]: 2025-11-25 19:06:48.124 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:06:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:06:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:06:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:06:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:06:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:06:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:06:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:06:49 np0005535656 podman[213721]: 2025-11-25 19:06:49.952830229 +0000 UTC m=+0.067851183 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Nov 25 14:06:50 np0005535656 nova_compute[187219]: 2025-11-25 19:06:50.950 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:52 np0005535656 podman[213742]: 2025-11-25 19:06:52.960927989 +0000 UTC m=+0.077422729 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 14:06:53 np0005535656 nova_compute[187219]: 2025-11-25 19:06:53.128 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:54 np0005535656 nova_compute[187219]: 2025-11-25 19:06:54.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:06:55 np0005535656 nova_compute[187219]: 2025-11-25 19:06:55.953 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:58 np0005535656 nova_compute[187219]: 2025-11-25 19:06:58.132 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:06:58 np0005535656 nova_compute[187219]: 2025-11-25 19:06:58.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:06:58 np0005535656 nova_compute[187219]: 2025-11-25 19:06:58.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:06:58 np0005535656 nova_compute[187219]: 2025-11-25 19:06:58.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:06:58 np0005535656 nova_compute[187219]: 2025-11-25 19:06:58.913 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:06:58 np0005535656 nova_compute[187219]: 2025-11-25 19:06:58.914 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:06:58 np0005535656 nova_compute[187219]: 2025-11-25 19:06:58.915 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 14:06:58 np0005535656 nova_compute[187219]: 2025-11-25 19:06:58.915 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid f3eec114-ab49-46a4-93df-5f391c88194f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:06:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:59.083 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:06:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:59.084 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:06:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:06:59.084 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:00 np0005535656 nova_compute[187219]: 2025-11-25 19:07:00.954 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:02 np0005535656 nova_compute[187219]: 2025-11-25 19:07:02.007 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Updating instance_info_cache with network_info: [{"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:07:02 np0005535656 nova_compute[187219]: 2025-11-25 19:07:02.023 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:07:02 np0005535656 nova_compute[187219]: 2025-11-25 19:07:02.024 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 14:07:02 np0005535656 nova_compute[187219]: 2025-11-25 19:07:02.025 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:02 np0005535656 nova_compute[187219]: 2025-11-25 19:07:02.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:02 np0005535656 nova_compute[187219]: 2025-11-25 19:07:02.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:02 np0005535656 nova_compute[187219]: 2025-11-25 19:07:02.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:07:03 np0005535656 nova_compute[187219]: 2025-11-25 19:07:03.134 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:03 np0005535656 nova_compute[187219]: 2025-11-25 19:07:03.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:04 np0005535656 nova_compute[187219]: 2025-11-25 19:07:04.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:05 np0005535656 podman[197580]: time="2025-11-25T19:07:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:07:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:07:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 25 14:07:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:07:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Nov 25 14:07:05 np0005535656 nova_compute[187219]: 2025-11-25 19:07:05.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:05 np0005535656 nova_compute[187219]: 2025-11-25 19:07:05.692 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:05 np0005535656 nova_compute[187219]: 2025-11-25 19:07:05.877 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:05 np0005535656 nova_compute[187219]: 2025-11-25 19:07:05.878 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:05 np0005535656 nova_compute[187219]: 2025-11-25 19:07:05.878 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:05 np0005535656 nova_compute[187219]: 2025-11-25 19:07:05.878 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:07:05 np0005535656 nova_compute[187219]: 2025-11-25 19:07:05.939 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:07:05 np0005535656 podman[213763]: 2025-11-25 19:07:05.953268467 +0000 UTC m=+0.071161682 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:07:05 np0005535656 nova_compute[187219]: 2025-11-25 19:07:05.966 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.031 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.032 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.122 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.339 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.341 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5707MB free_disk=73.1346206665039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.342 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.342 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.420 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance f3eec114-ab49-46a4-93df-5f391c88194f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.421 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.421 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.466 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.485 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.512 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:07:06 np0005535656 nova_compute[187219]: 2025-11-25 19:07:06.513 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:06 np0005535656 ovn_controller[95460]: 2025-11-25T19:07:06Z|00104|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 25 14:07:07 np0005535656 nova_compute[187219]: 2025-11-25 19:07:07.979 187223 DEBUG nova.compute.manager [None req-b884251f-070b-41c2-b55d-3b516644cae5 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610#033[00m
Nov 25 14:07:08 np0005535656 nova_compute[187219]: 2025-11-25 19:07:08.038 187223 DEBUG nova.compute.provider_tree [None req-b884251f-070b-41c2-b55d-3b516644cae5 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 23 to 25 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 14:07:08 np0005535656 nova_compute[187219]: 2025-11-25 19:07:08.136 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:10 np0005535656 nova_compute[187219]: 2025-11-25 19:07:10.493 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:10 np0005535656 nova_compute[187219]: 2025-11-25 19:07:10.996 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:13 np0005535656 nova_compute[187219]: 2025-11-25 19:07:13.141 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:13 np0005535656 nova_compute[187219]: 2025-11-25 19:07:13.768 187223 DEBUG nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Check if temp file /var/lib/nova/instances/tmp5660i4ja exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 25 14:07:13 np0005535656 nova_compute[187219]: 2025-11-25 19:07:13.769 187223 DEBUG nova.compute.manager [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5660i4ja',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f3eec114-ab49-46a4-93df-5f391c88194f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 25 14:07:13 np0005535656 podman[213795]: 2025-11-25 19:07:13.895407095 +0000 UTC m=+0.082938798 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 25 14:07:13 np0005535656 podman[213794]: 2025-11-25 19:07:13.945057917 +0000 UTC m=+0.138354215 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 25 14:07:15 np0005535656 nova_compute[187219]: 2025-11-25 19:07:15.997 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:16 np0005535656 nova_compute[187219]: 2025-11-25 19:07:16.608 187223 DEBUG oslo_concurrency.processutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:07:16 np0005535656 nova_compute[187219]: 2025-11-25 19:07:16.704 187223 DEBUG oslo_concurrency.processutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:07:16 np0005535656 nova_compute[187219]: 2025-11-25 19:07:16.705 187223 DEBUG oslo_concurrency.processutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:07:16 np0005535656 nova_compute[187219]: 2025-11-25 19:07:16.796 187223 DEBUG oslo_concurrency.processutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:07:18 np0005535656 nova_compute[187219]: 2025-11-25 19:07:18.145 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:07:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:07:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:07:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:07:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:07:20 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 14:07:20 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 14:07:20 np0005535656 systemd-logind[788]: New session 34 of user nova.
Nov 25 14:07:20 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 14:07:20 np0005535656 podman[213847]: 2025-11-25 19:07:20.692418459 +0000 UTC m=+0.083831121 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-type=git, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 14:07:20 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 14:07:20 np0005535656 systemd[213870]: Queued start job for default target Main User Target.
Nov 25 14:07:20 np0005535656 systemd[213870]: Created slice User Application Slice.
Nov 25 14:07:20 np0005535656 systemd[213870]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:07:20 np0005535656 systemd[213870]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 14:07:20 np0005535656 systemd[213870]: Reached target Paths.
Nov 25 14:07:20 np0005535656 systemd[213870]: Reached target Timers.
Nov 25 14:07:20 np0005535656 systemd[213870]: Starting D-Bus User Message Bus Socket...
Nov 25 14:07:20 np0005535656 systemd[213870]: Starting Create User's Volatile Files and Directories...
Nov 25 14:07:20 np0005535656 systemd[213870]: Listening on D-Bus User Message Bus Socket.
Nov 25 14:07:20 np0005535656 systemd[213870]: Reached target Sockets.
Nov 25 14:07:20 np0005535656 systemd[213870]: Finished Create User's Volatile Files and Directories.
Nov 25 14:07:20 np0005535656 systemd[213870]: Reached target Basic System.
Nov 25 14:07:20 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 14:07:20 np0005535656 systemd[213870]: Reached target Main User Target.
Nov 25 14:07:20 np0005535656 systemd[213870]: Startup finished in 142ms.
Nov 25 14:07:20 np0005535656 systemd[1]: Started Session 34 of User nova.
Nov 25 14:07:20 np0005535656 systemd[1]: session-34.scope: Deactivated successfully.
Nov 25 14:07:20 np0005535656 systemd-logind[788]: Session 34 logged out. Waiting for processes to exit.
Nov 25 14:07:20 np0005535656 systemd-logind[788]: Removed session 34.
Nov 25 14:07:21 np0005535656 nova_compute[187219]: 2025-11-25 19:07:20.999 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:23.108 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.109 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:23.110 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.147 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.160 187223 DEBUG nova.compute.manager [req-224afae4-1cb9-4dc8-9177-9c2c24623fe3 req-5a8acdf6-1794-447a-be68-fcd5ce430a70 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-unplugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.161 187223 DEBUG oslo_concurrency.lockutils [req-224afae4-1cb9-4dc8-9177-9c2c24623fe3 req-5a8acdf6-1794-447a-be68-fcd5ce430a70 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.161 187223 DEBUG oslo_concurrency.lockutils [req-224afae4-1cb9-4dc8-9177-9c2c24623fe3 req-5a8acdf6-1794-447a-be68-fcd5ce430a70 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.162 187223 DEBUG oslo_concurrency.lockutils [req-224afae4-1cb9-4dc8-9177-9c2c24623fe3 req-5a8acdf6-1794-447a-be68-fcd5ce430a70 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.162 187223 DEBUG nova.compute.manager [req-224afae4-1cb9-4dc8-9177-9c2c24623fe3 req-5a8acdf6-1794-447a-be68-fcd5ce430a70 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] No waiting events found dispatching network-vif-unplugged-d57a8988-8300-46ea-a464-f60cefb0a63c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.162 187223 DEBUG nova.compute.manager [req-224afae4-1cb9-4dc8-9177-9c2c24623fe3 req-5a8acdf6-1794-447a-be68-fcd5ce430a70 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-unplugged-d57a8988-8300-46ea-a464-f60cefb0a63c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.437 187223 INFO nova.compute.manager [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Took 6.64 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.437 187223 DEBUG nova.compute.manager [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.452 187223 DEBUG nova.compute.manager [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5660i4ja',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f3eec114-ab49-46a4-93df-5f391c88194f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(92313ac9-168a-4e41-9d0c-46a3f8ac700b),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.472 187223 DEBUG nova.objects.instance [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid f3eec114-ab49-46a4-93df-5f391c88194f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.473 187223 DEBUG nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.474 187223 DEBUG nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.474 187223 DEBUG nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.489 187223 DEBUG nova.virt.libvirt.vif [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:06:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-214460348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-214460348',id=13,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:06:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-d8oi41kt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:06:25Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=f3eec114-ab49-46a4-93df-5f391c88194f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.489 187223 DEBUG nova.network.os_vif_util [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.490 187223 DEBUG nova.network.os_vif_util [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:5b:ca,bridge_name='br-int',has_traffic_filtering=True,id=d57a8988-8300-46ea-a464-f60cefb0a63c,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd57a8988-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.490 187223 DEBUG nova.virt.libvirt.migration [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 14:07:23 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:05:5b:ca"/>
Nov 25 14:07:23 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 14:07:23 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:07:23 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 14:07:23 np0005535656 nova_compute[187219]:  <target dev="tapd57a8988-83"/>
Nov 25 14:07:23 np0005535656 nova_compute[187219]: </interface>
Nov 25 14:07:23 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.491 187223 DEBUG nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.977 187223 DEBUG nova.virt.libvirt.migration [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:07:23 np0005535656 nova_compute[187219]: 2025-11-25 19:07:23.978 187223 INFO nova.virt.libvirt.migration [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 14:07:23 np0005535656 podman[213888]: 2025-11-25 19:07:23.990945208 +0000 UTC m=+0.097783187 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:07:24 np0005535656 nova_compute[187219]: 2025-11-25 19:07:24.087 187223 INFO nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 14:07:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:24.113 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:07:24 np0005535656 nova_compute[187219]: 2025-11-25 19:07:24.592 187223 DEBUG nova.virt.libvirt.migration [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:07:24 np0005535656 nova_compute[187219]: 2025-11-25 19:07:24.592 187223 DEBUG nova.virt.libvirt.migration [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:07:24 np0005535656 nova_compute[187219]: 2025-11-25 19:07:24.844 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097644.8438933, f3eec114-ab49-46a4-93df-5f391c88194f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:07:24 np0005535656 nova_compute[187219]: 2025-11-25 19:07:24.844 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:07:24 np0005535656 nova_compute[187219]: 2025-11-25 19:07:24.871 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:07:24 np0005535656 nova_compute[187219]: 2025-11-25 19:07:24.876 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:07:24 np0005535656 nova_compute[187219]: 2025-11-25 19:07:24.908 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 14:07:24 np0005535656 kernel: tapd57a8988-83 (unregistering): left promiscuous mode
Nov 25 14:07:24 np0005535656 NetworkManager[55548]: <info>  [1764097644.9991] device (tapd57a8988-83): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:07:25 np0005535656 ovn_controller[95460]: 2025-11-25T19:07:25Z|00105|binding|INFO|Releasing lport d57a8988-8300-46ea-a464-f60cefb0a63c from this chassis (sb_readonly=0)
Nov 25 14:07:25 np0005535656 ovn_controller[95460]: 2025-11-25T19:07:25Z|00106|binding|INFO|Setting lport d57a8988-8300-46ea-a464-f60cefb0a63c down in Southbound
Nov 25 14:07:25 np0005535656 ovn_controller[95460]: 2025-11-25T19:07:25Z|00107|binding|INFO|Removing iface tapd57a8988-83 ovn-installed in OVS
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.004 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.007 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.017 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:5b:ca 10.100.0.12'], port_security=['fa:16:3e:05:5b:ca 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f3eec114-ab49-46a4-93df-5f391c88194f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=d57a8988-8300-46ea-a464-f60cefb0a63c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.019 104346 INFO neutron.agent.ovn.metadata.agent [-] Port d57a8988-8300-46ea-a464-f60cefb0a63c in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.022 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.023 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1b37ef42-0791-441a-8ec5-1161f1645f68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.024 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace which is not needed anymore#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.037 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:25 np0005535656 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 25 14:07:25 np0005535656 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Consumed 15.797s CPU time.
Nov 25 14:07:25 np0005535656 systemd-machined[153481]: Machine qemu-9-instance-0000000d terminated.
Nov 25 14:07:25 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[213615]: [NOTICE]   (213619) : haproxy version is 2.8.14-c23fe91
Nov 25 14:07:25 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[213615]: [NOTICE]   (213619) : path to executable is /usr/sbin/haproxy
Nov 25 14:07:25 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[213615]: [WARNING]  (213619) : Exiting Master process...
Nov 25 14:07:25 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[213615]: [WARNING]  (213619) : Exiting Master process...
Nov 25 14:07:25 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[213615]: [ALERT]    (213619) : Current worker (213621) exited with code 143 (Terminated)
Nov 25 14:07:25 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[213615]: [WARNING]  (213619) : All workers exited. Exiting... (0)
Nov 25 14:07:25 np0005535656 systemd[1]: libpod-b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71.scope: Deactivated successfully.
Nov 25 14:07:25 np0005535656 podman[213945]: 2025-11-25 19:07:25.189781675 +0000 UTC m=+0.058902443 container died b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.253 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:25 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71-userdata-shm.mount: Deactivated successfully.
Nov 25 14:07:25 np0005535656 systemd[1]: var-lib-containers-storage-overlay-cbc8bf84e6187c1b568c9ccb9e21ba7ae16eb67364ec0d18f5860823b2e87a2f-merged.mount: Deactivated successfully.
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.267 187223 DEBUG nova.compute.manager [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.268 187223 DEBUG oslo_concurrency.lockutils [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.268 187223 DEBUG oslo_concurrency.lockutils [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.268 187223 DEBUG oslo_concurrency.lockutils [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.269 187223 DEBUG nova.compute.manager [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] No waiting events found dispatching network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.270 187223 WARNING nova.compute.manager [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received unexpected event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.270 187223 DEBUG nova.compute.manager [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-changed-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.270 187223 DEBUG nova.compute.manager [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Refreshing instance network info cache due to event network-changed-d57a8988-8300-46ea-a464-f60cefb0a63c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.271 187223 DEBUG oslo_concurrency.lockutils [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.271 187223 DEBUG oslo_concurrency.lockutils [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.271 187223 DEBUG nova.network.neutron [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Refreshing network info cache for port d57a8988-8300-46ea-a464-f60cefb0a63c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:07:25 np0005535656 podman[213945]: 2025-11-25 19:07:25.271847528 +0000 UTC m=+0.140968296 container cleanup b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 14:07:25 np0005535656 systemd[1]: libpod-conmon-b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71.scope: Deactivated successfully.
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.290 187223 DEBUG nova.virt.libvirt.guest [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.290 187223 INFO nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Migration operation has completed#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.291 187223 INFO nova.compute.manager [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] _post_live_migration() is started..#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.302 187223 DEBUG nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.302 187223 DEBUG nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.302 187223 DEBUG nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 14:07:25 np0005535656 podman[213990]: 2025-11-25 19:07:25.343521462 +0000 UTC m=+0.041703270 container remove b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.348 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[609c6378-f639-4a21-9bfe-c7adfbf1a4fc]: (4, ('Tue Nov 25 07:07:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71)\nb434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71\nTue Nov 25 07:07:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (b434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71)\nb434c6912fbaa6408e3b69c89ca846b55039722af41f026457bc362da5571f71\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.349 187223 DEBUG nova.compute.manager [req-578eeb40-b135-4969-abe3-28be9ed54a4c req-ce3cb949-0e8b-4dcf-975a-820bf5c7751b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-unplugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.350 187223 DEBUG oslo_concurrency.lockutils [req-578eeb40-b135-4969-abe3-28be9ed54a4c req-ce3cb949-0e8b-4dcf-975a-820bf5c7751b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.350 187223 DEBUG oslo_concurrency.lockutils [req-578eeb40-b135-4969-abe3-28be9ed54a4c req-ce3cb949-0e8b-4dcf-975a-820bf5c7751b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.350 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4018e90d-0540-4e02-b297-dc9ab6db694b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.350 187223 DEBUG oslo_concurrency.lockutils [req-578eeb40-b135-4969-abe3-28be9ed54a4c req-ce3cb949-0e8b-4dcf-975a-820bf5c7751b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.350 187223 DEBUG nova.compute.manager [req-578eeb40-b135-4969-abe3-28be9ed54a4c req-ce3cb949-0e8b-4dcf-975a-820bf5c7751b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] No waiting events found dispatching network-vif-unplugged-d57a8988-8300-46ea-a464-f60cefb0a63c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.351 187223 DEBUG nova.compute.manager [req-578eeb40-b135-4969-abe3-28be9ed54a4c req-ce3cb949-0e8b-4dcf-975a-820bf5c7751b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-unplugged-d57a8988-8300-46ea-a464-f60cefb0a63c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.351 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.352 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:25 np0005535656 kernel: tap8e881e87-b0: left promiscuous mode
Nov 25 14:07:25 np0005535656 nova_compute[187219]: 2025-11-25 19:07:25.367 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.369 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[65937d8e-fcaf-4d98-8c4a-2427e4209548]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.383 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[df1c5097-e384-436b-8cfb-7f51600ebd43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.384 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b24e77-efed-4a62-baf3-b5fe8a1ce1f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.400 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[600f836e-2e97-466f-89bf-2df1426ff6b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454592, 'reachable_time': 23476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214011, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:07:25 np0005535656 systemd[1]: run-netns-ovnmeta\x2d8e881e87\x2db103\x2d4ad8\x2d8de5\x2df8f4f0a10891.mount: Deactivated successfully.
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.403 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:07:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:25.404 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[052a9e60-6b43-42fb-af6f-c3e8a6c40cf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.000 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.084 187223 DEBUG nova.network.neutron [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Activated binding for port d57a8988-8300-46ea-a464-f60cefb0a63c and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.084 187223 DEBUG nova.compute.manager [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.085 187223 DEBUG nova.virt.libvirt.vif [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:06:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-214460348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-214460348',id=13,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:06:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-d8oi41kt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:07:10Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=f3eec114-ab49-46a4-93df-5f391c88194f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.085 187223 DEBUG nova.network.os_vif_util [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.086 187223 DEBUG nova.network.os_vif_util [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:5b:ca,bridge_name='br-int',has_traffic_filtering=True,id=d57a8988-8300-46ea-a464-f60cefb0a63c,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd57a8988-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.086 187223 DEBUG os_vif [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:5b:ca,bridge_name='br-int',has_traffic_filtering=True,id=d57a8988-8300-46ea-a464-f60cefb0a63c,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd57a8988-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.087 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.087 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd57a8988-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.089 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.090 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.092 187223 INFO os_vif [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:5b:ca,bridge_name='br-int',has_traffic_filtering=True,id=d57a8988-8300-46ea-a464-f60cefb0a63c,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd57a8988-83')#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.093 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.093 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.093 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.093 187223 DEBUG nova.compute.manager [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.094 187223 INFO nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Deleting instance files /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f_del#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.094 187223 INFO nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Deletion of /var/lib/nova/instances/f3eec114-ab49-46a4-93df-5f391c88194f_del complete#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.637 187223 DEBUG nova.network.neutron [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Updated VIF entry in instance network info cache for port d57a8988-8300-46ea-a464-f60cefb0a63c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.637 187223 DEBUG nova.network.neutron [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Updating instance_info_cache with network_info: [{"id": "d57a8988-8300-46ea-a464-f60cefb0a63c", "address": "fa:16:3e:05:5b:ca", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd57a8988-83", "ovs_interfaceid": "d57a8988-8300-46ea-a464-f60cefb0a63c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:07:26 np0005535656 nova_compute[187219]: 2025-11-25 19:07:26.656 187223 DEBUG oslo_concurrency.lockutils [req-e78d49f0-f653-49e5-a11d-73c51b76bdaf req-592ee0c2-266f-4d52-9afe-6995af76cea3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-f3eec114-ab49-46a4-93df-5f391c88194f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.430 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.430 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.430 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.430 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.430 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] No waiting events found dispatching network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.431 187223 WARNING nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received unexpected event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.431 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.431 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.431 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.431 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.432 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] No waiting events found dispatching network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.432 187223 WARNING nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received unexpected event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.432 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-unplugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.432 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.432 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.433 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.433 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] No waiting events found dispatching network-vif-unplugged-d57a8988-8300-46ea-a464-f60cefb0a63c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.433 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-unplugged-d57a8988-8300-46ea-a464-f60cefb0a63c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.433 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.433 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.434 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.434 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.434 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] No waiting events found dispatching network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.434 187223 WARNING nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received unexpected event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.434 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.434 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.435 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.435 187223 DEBUG oslo_concurrency.lockutils [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.435 187223 DEBUG nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] No waiting events found dispatching network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:07:27 np0005535656 nova_compute[187219]: 2025-11-25 19:07:27.435 187223 WARNING nova.compute.manager [req-5ba8bef4-8909-4dd3-bcaf-d8e1878723c0 req-a1d671ba-5884-4afc-9cc0-98a10db8add6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Received unexpected event network-vif-plugged-d57a8988-8300-46ea-a464-f60cefb0a63c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.180 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.181 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.182 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f3eec114-ab49-46a4-93df-5f391c88194f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.201 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.201 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.202 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.202 187223 DEBUG nova.compute.resource_tracker [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.384 187223 WARNING nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.385 187223 DEBUG nova.compute.resource_tracker [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5869MB free_disk=73.16388702392578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.385 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.385 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.438 187223 DEBUG nova.compute.resource_tracker [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration for instance f3eec114-ab49-46a4-93df-5f391c88194f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.466 187223 DEBUG nova.compute.resource_tracker [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.522 187223 DEBUG nova.compute.resource_tracker [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration 92313ac9-168a-4e41-9d0c-46a3f8ac700b is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.523 187223 DEBUG nova.compute.resource_tracker [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.523 187223 DEBUG nova.compute.resource_tracker [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.563 187223 DEBUG nova.compute.provider_tree [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.575 187223 DEBUG nova.scheduler.client.report [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.592 187223 DEBUG nova.compute.resource_tracker [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.593 187223 DEBUG oslo_concurrency.lockutils [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.597 187223 INFO nova.compute.manager [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.683 187223 INFO nova.scheduler.client.report [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Deleted allocation for migration 92313ac9-168a-4e41-9d0c-46a3f8ac700b#033[00m
Nov 25 14:07:30 np0005535656 nova_compute[187219]: 2025-11-25 19:07:30.684 187223 DEBUG nova.virt.libvirt.driver [None req-153f6b15-1a83-4647-8954-556cb9b55060 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 25 14:07:31 np0005535656 nova_compute[187219]: 2025-11-25 19:07:31.004 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:31 np0005535656 nova_compute[187219]: 2025-11-25 19:07:31.090 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:31 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 14:07:31 np0005535656 systemd[213870]: Activating special unit Exit the Session...
Nov 25 14:07:31 np0005535656 systemd[213870]: Stopped target Main User Target.
Nov 25 14:07:31 np0005535656 systemd[213870]: Stopped target Basic System.
Nov 25 14:07:31 np0005535656 systemd[213870]: Stopped target Paths.
Nov 25 14:07:31 np0005535656 systemd[213870]: Stopped target Sockets.
Nov 25 14:07:31 np0005535656 systemd[213870]: Stopped target Timers.
Nov 25 14:07:31 np0005535656 systemd[213870]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:07:31 np0005535656 systemd[213870]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 14:07:31 np0005535656 systemd[213870]: Closed D-Bus User Message Bus Socket.
Nov 25 14:07:31 np0005535656 systemd[213870]: Stopped Create User's Volatile Files and Directories.
Nov 25 14:07:31 np0005535656 systemd[213870]: Removed slice User Application Slice.
Nov 25 14:07:31 np0005535656 systemd[213870]: Reached target Shutdown.
Nov 25 14:07:31 np0005535656 systemd[213870]: Finished Exit the Session.
Nov 25 14:07:31 np0005535656 systemd[213870]: Reached target Exit the Session.
Nov 25 14:07:31 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 14:07:31 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 14:07:31 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 14:07:31 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 14:07:31 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 14:07:31 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 14:07:31 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 14:07:35 np0005535656 podman[197580]: time="2025-11-25T19:07:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:07:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:07:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:07:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:07:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 25 14:07:36 np0005535656 nova_compute[187219]: 2025-11-25 19:07:36.007 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:36 np0005535656 nova_compute[187219]: 2025-11-25 19:07:36.093 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:36 np0005535656 podman[214017]: 2025-11-25 19:07:36.21394322 +0000 UTC m=+0.079061023 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:07:40 np0005535656 nova_compute[187219]: 2025-11-25 19:07:40.297 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764097645.289849, f3eec114-ab49-46a4-93df-5f391c88194f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:07:40 np0005535656 nova_compute[187219]: 2025-11-25 19:07:40.298 187223 INFO nova.compute.manager [-] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:07:40 np0005535656 nova_compute[187219]: 2025-11-25 19:07:40.327 187223 DEBUG nova.compute.manager [None req-4c2e8d3b-7085-41cd-984b-c5d1f4163d2d - - - - - -] [instance: f3eec114-ab49-46a4-93df-5f391c88194f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:07:41 np0005535656 nova_compute[187219]: 2025-11-25 19:07:41.009 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:41 np0005535656 nova_compute[187219]: 2025-11-25 19:07:41.095 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:44 np0005535656 podman[214042]: 2025-11-25 19:07:44.988404415 +0000 UTC m=+0.090217192 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 14:07:45 np0005535656 podman[214041]: 2025-11-25 19:07:45.042292431 +0000 UTC m=+0.154570670 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:07:46 np0005535656 nova_compute[187219]: 2025-11-25 19:07:46.010 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:46 np0005535656 nova_compute[187219]: 2025-11-25 19:07:46.096 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:07:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:07:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:07:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:07:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:07:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:07:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:07:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:07:50 np0005535656 podman[214086]: 2025-11-25 19:07:50.982284567 +0000 UTC m=+0.089651688 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6)
Nov 25 14:07:51 np0005535656 nova_compute[187219]: 2025-11-25 19:07:51.013 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:51 np0005535656 nova_compute[187219]: 2025-11-25 19:07:51.098 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:54 np0005535656 podman[214107]: 2025-11-25 19:07:54.972093814 +0000 UTC m=+0.091477776 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 14:07:56 np0005535656 nova_compute[187219]: 2025-11-25 19:07:56.014 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:56 np0005535656 nova_compute[187219]: 2025-11-25 19:07:56.100 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:07:56 np0005535656 nova_compute[187219]: 2025-11-25 19:07:56.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:59.083 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:07:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:59.084 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:07:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:07:59.084 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:07:59 np0005535656 nova_compute[187219]: 2025-11-25 19:07:59.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:07:59 np0005535656 nova_compute[187219]: 2025-11-25 19:07:59.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:07:59 np0005535656 nova_compute[187219]: 2025-11-25 19:07:59.674 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:07:59 np0005535656 nova_compute[187219]: 2025-11-25 19:07:59.688 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:08:00 np0005535656 nova_compute[187219]: 2025-11-25 19:08:00.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:08:00 np0005535656 nova_compute[187219]: 2025-11-25 19:08:00.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:08:01 np0005535656 nova_compute[187219]: 2025-11-25 19:08:01.014 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:01 np0005535656 nova_compute[187219]: 2025-11-25 19:08:01.102 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:03 np0005535656 nova_compute[187219]: 2025-11-25 19:08:03.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:08:03 np0005535656 nova_compute[187219]: 2025-11-25 19:08:03.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:08:05 np0005535656 podman[197580]: time="2025-11-25T19:08:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:08:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:08:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:08:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:08:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 25 14:08:05 np0005535656 nova_compute[187219]: 2025-11-25 19:08:05.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:08:06 np0005535656 nova_compute[187219]: 2025-11-25 19:08:06.017 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:06 np0005535656 nova_compute[187219]: 2025-11-25 19:08:06.103 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:06 np0005535656 nova_compute[187219]: 2025-11-25 19:08:06.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:08:06 np0005535656 podman[214128]: 2025-11-25 19:08:06.926807254 +0000 UTC m=+0.048742380 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:08:07 np0005535656 nova_compute[187219]: 2025-11-25 19:08:07.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:08:07 np0005535656 nova_compute[187219]: 2025-11-25 19:08:07.696 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:07 np0005535656 nova_compute[187219]: 2025-11-25 19:08:07.696 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:07 np0005535656 nova_compute[187219]: 2025-11-25 19:08:07.697 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:07 np0005535656 nova_compute[187219]: 2025-11-25 19:08:07.697 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:08:07 np0005535656 nova_compute[187219]: 2025-11-25 19:08:07.851 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:08:07 np0005535656 nova_compute[187219]: 2025-11-25 19:08:07.852 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5898MB free_disk=73.16390609741211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:08:07 np0005535656 nova_compute[187219]: 2025-11-25 19:08:07.852 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:07 np0005535656 nova_compute[187219]: 2025-11-25 19:08:07.852 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.095 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.095 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.115 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing inventories for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.157 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating ProviderTree inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.157 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.174 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing aggregate associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.193 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing trait associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STATUS_DISABLED,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.210 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.224 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.225 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:08:08 np0005535656 nova_compute[187219]: 2025-11-25 19:08:08.225 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:09 np0005535656 ovn_controller[95460]: 2025-11-25T19:08:09Z|00108|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Nov 25 14:08:11 np0005535656 nova_compute[187219]: 2025-11-25 19:08:11.018 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:11 np0005535656 nova_compute[187219]: 2025-11-25 19:08:11.104 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:11 np0005535656 nova_compute[187219]: 2025-11-25 19:08:11.226 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:08:13 np0005535656 nova_compute[187219]: 2025-11-25 19:08:13.022 187223 DEBUG nova.compute.manager [None req-ce2d0cf4-3539-414f-b14b-8eedb87d3947 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606#033[00m
Nov 25 14:08:13 np0005535656 nova_compute[187219]: 2025-11-25 19:08:13.106 187223 DEBUG nova.compute.provider_tree [None req-ce2d0cf4-3539-414f-b14b-8eedb87d3947 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 27 to 28 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 14:08:15 np0005535656 podman[214152]: 2025-11-25 19:08:15.982782907 +0000 UTC m=+0.094562500 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 25 14:08:16 np0005535656 podman[214151]: 2025-11-25 19:08:16.010186153 +0000 UTC m=+0.134895783 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 25 14:08:16 np0005535656 nova_compute[187219]: 2025-11-25 19:08:16.062 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:16 np0005535656 nova_compute[187219]: 2025-11-25 19:08:16.106 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:08:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:08:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:08:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:08:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:08:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:08:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:08:21 np0005535656 nova_compute[187219]: 2025-11-25 19:08:21.065 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:21 np0005535656 nova_compute[187219]: 2025-11-25 19:08:21.107 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:21 np0005535656 podman[214197]: 2025-11-25 19:08:21.984617373 +0000 UTC m=+0.090769088 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Nov 25 14:08:25 np0005535656 podman[214218]: 2025-11-25 19:08:25.984978656 +0000 UTC m=+0.099020150 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 25 14:08:26 np0005535656 nova_compute[187219]: 2025-11-25 19:08:26.067 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:26 np0005535656 nova_compute[187219]: 2025-11-25 19:08:26.109 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:28.506 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:08:28 np0005535656 nova_compute[187219]: 2025-11-25 19:08:28.507 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:28.507 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:08:31 np0005535656 nova_compute[187219]: 2025-11-25 19:08:31.068 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:31 np0005535656 nova_compute[187219]: 2025-11-25 19:08:31.111 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:31 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:31.510 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:08:35 np0005535656 podman[197580]: time="2025-11-25T19:08:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:08:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:08:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:08:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:08:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 25 14:08:36 np0005535656 nova_compute[187219]: 2025-11-25 19:08:36.070 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:36 np0005535656 nova_compute[187219]: 2025-11-25 19:08:36.113 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:37 np0005535656 podman[214239]: 2025-11-25 19:08:37.923307085 +0000 UTC m=+0.052360198 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.324 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "708a51ee-14d7-4511-ab36-5798d1c8de28" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.324 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.344 187223 DEBUG nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.430 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.430 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.443 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.443 187223 INFO nova.compute.claims [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.565 187223 DEBUG nova.compute.provider_tree [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.578 187223 DEBUG nova.scheduler.client.report [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.598 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.598 187223 DEBUG nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.642 187223 DEBUG nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.643 187223 DEBUG nova.network.neutron [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.659 187223 INFO nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.687 187223 DEBUG nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.809 187223 DEBUG nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.810 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.811 187223 INFO nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Creating image(s)#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.812 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "/var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.812 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.813 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.829 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.900 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.901 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.901 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.913 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.935 187223 DEBUG nova.policy [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e60aa8a36ef94fa186a5c8de1df9e594', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab3670f92d82410b981d159346c0c038', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.993 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:08:38 np0005535656 nova_compute[187219]: 2025-11-25 19:08:38.994 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.027 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.029 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.030 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.116 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.118 187223 DEBUG nova.virt.disk.api [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Checking if we can resize image /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.118 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.198 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.200 187223 DEBUG nova.virt.disk.api [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Cannot resize image /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.200 187223 DEBUG nova.objects.instance [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'migration_context' on Instance uuid 708a51ee-14d7-4511-ab36-5798d1c8de28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.216 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.217 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Ensure instance console log exists: /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.218 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.218 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.219 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:39 np0005535656 nova_compute[187219]: 2025-11-25 19:08:39.599 187223 DEBUG nova.network.neutron [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Successfully created port: 4e91cc02-2569-488e-b88d-1a635ca9e1fa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:08:40 np0005535656 nova_compute[187219]: 2025-11-25 19:08:40.982 187223 DEBUG nova.network.neutron [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Successfully updated port: 4e91cc02-2569-488e-b88d-1a635ca9e1fa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:08:40 np0005535656 nova_compute[187219]: 2025-11-25 19:08:40.998 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "refresh_cache-708a51ee-14d7-4511-ab36-5798d1c8de28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:08:40 np0005535656 nova_compute[187219]: 2025-11-25 19:08:40.999 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquired lock "refresh_cache-708a51ee-14d7-4511-ab36-5798d1c8de28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:08:40 np0005535656 nova_compute[187219]: 2025-11-25 19:08:40.999 187223 DEBUG nova.network.neutron [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:08:41 np0005535656 nova_compute[187219]: 2025-11-25 19:08:41.063 187223 DEBUG nova.compute.manager [req-f374c83e-1225-4ada-974e-5adfd95eb9e6 req-d266820a-a047-43d8-98a4-e48869191d10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Received event network-changed-4e91cc02-2569-488e-b88d-1a635ca9e1fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:08:41 np0005535656 nova_compute[187219]: 2025-11-25 19:08:41.064 187223 DEBUG nova.compute.manager [req-f374c83e-1225-4ada-974e-5adfd95eb9e6 req-d266820a-a047-43d8-98a4-e48869191d10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Refreshing instance network info cache due to event network-changed-4e91cc02-2569-488e-b88d-1a635ca9e1fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:08:41 np0005535656 nova_compute[187219]: 2025-11-25 19:08:41.065 187223 DEBUG oslo_concurrency.lockutils [req-f374c83e-1225-4ada-974e-5adfd95eb9e6 req-d266820a-a047-43d8-98a4-e48869191d10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-708a51ee-14d7-4511-ab36-5798d1c8de28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:08:41 np0005535656 nova_compute[187219]: 2025-11-25 19:08:41.093 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:41 np0005535656 nova_compute[187219]: 2025-11-25 19:08:41.115 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:41 np0005535656 nova_compute[187219]: 2025-11-25 19:08:41.124 187223 DEBUG nova.network.neutron [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.075 187223 DEBUG nova.network.neutron [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Updating instance_info_cache with network_info: [{"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.092 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Releasing lock "refresh_cache-708a51ee-14d7-4511-ab36-5798d1c8de28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.093 187223 DEBUG nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Instance network_info: |[{"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.093 187223 DEBUG oslo_concurrency.lockutils [req-f374c83e-1225-4ada-974e-5adfd95eb9e6 req-d266820a-a047-43d8-98a4-e48869191d10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-708a51ee-14d7-4511-ab36-5798d1c8de28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.093 187223 DEBUG nova.network.neutron [req-f374c83e-1225-4ada-974e-5adfd95eb9e6 req-d266820a-a047-43d8-98a4-e48869191d10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Refreshing network info cache for port 4e91cc02-2569-488e-b88d-1a635ca9e1fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.097 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Start _get_guest_xml network_info=[{"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.103 187223 WARNING nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.115 187223 DEBUG nova.virt.libvirt.host [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.116 187223 DEBUG nova.virt.libvirt.host [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.119 187223 DEBUG nova.virt.libvirt.host [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.120 187223 DEBUG nova.virt.libvirt.host [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.122 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.122 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.123 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.124 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.124 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.125 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.125 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.125 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.126 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.126 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.127 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.127 187223 DEBUG nova.virt.hardware [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.134 187223 DEBUG nova.virt.libvirt.vif [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:08:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-343208568',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-343208568',id=16,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-44wciidh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:08:38Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=708a51ee-14d7-4511-ab36-5798d1c8de28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.135 187223 DEBUG nova.network.os_vif_util [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.136 187223 DEBUG nova.network.os_vif_util [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:2c:ab,bridge_name='br-int',has_traffic_filtering=True,id=4e91cc02-2569-488e-b88d-1a635ca9e1fa,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e91cc02-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.137 187223 DEBUG nova.objects.instance [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'pci_devices' on Instance uuid 708a51ee-14d7-4511-ab36-5798d1c8de28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.156 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <uuid>708a51ee-14d7-4511-ab36-5798d1c8de28</uuid>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <name>instance-00000010</name>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteStrategies-server-343208568</nova:name>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:08:42</nova:creationTime>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:        <nova:user uuid="e60aa8a36ef94fa186a5c8de1df9e594">tempest-TestExecuteStrategies-2025590332-project-member</nova:user>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:        <nova:project uuid="ab3670f92d82410b981d159346c0c038">tempest-TestExecuteStrategies-2025590332</nova:project>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:        <nova:port uuid="4e91cc02-2569-488e-b88d-1a635ca9e1fa">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <entry name="serial">708a51ee-14d7-4511-ab36-5798d1c8de28</entry>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <entry name="uuid">708a51ee-14d7-4511-ab36-5798d1c8de28</entry>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk.config"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:4f:2c:ab"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <target dev="tap4e91cc02-25"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/console.log" append="off"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:08:42 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:08:42 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:08:42 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:08:42 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.158 187223 DEBUG nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Preparing to wait for external event network-vif-plugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.158 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.159 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.159 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.160 187223 DEBUG nova.virt.libvirt.vif [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:08:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-343208568',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-343208568',id=16,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-44wciidh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:08:38Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=708a51ee-14d7-4511-ab36-5798d1c8de28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.160 187223 DEBUG nova.network.os_vif_util [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.161 187223 DEBUG nova.network.os_vif_util [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:2c:ab,bridge_name='br-int',has_traffic_filtering=True,id=4e91cc02-2569-488e-b88d-1a635ca9e1fa,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e91cc02-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.161 187223 DEBUG os_vif [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:2c:ab,bridge_name='br-int',has_traffic_filtering=True,id=4e91cc02-2569-488e-b88d-1a635ca9e1fa,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e91cc02-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.162 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.162 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.163 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.167 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.168 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4e91cc02-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.169 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4e91cc02-25, col_values=(('external_ids', {'iface-id': '4e91cc02-2569-488e-b88d-1a635ca9e1fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:2c:ab', 'vm-uuid': '708a51ee-14d7-4511-ab36-5798d1c8de28'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.172 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:42 np0005535656 NetworkManager[55548]: <info>  [1764097722.1736] manager: (tap4e91cc02-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.176 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.183 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.185 187223 INFO os_vif [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:2c:ab,bridge_name='br-int',has_traffic_filtering=True,id=4e91cc02-2569-488e-b88d-1a635ca9e1fa,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e91cc02-25')#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.234 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.235 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.237 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No VIF found with MAC fa:16:3e:4f:2c:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.238 187223 INFO nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Using config drive#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.552 187223 INFO nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Creating config drive at /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk.config#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.557 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpayoqpn8y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.682 187223 DEBUG oslo_concurrency.processutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpayoqpn8y" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:08:42 np0005535656 kernel: tap4e91cc02-25: entered promiscuous mode
Nov 25 14:08:42 np0005535656 ovn_controller[95460]: 2025-11-25T19:08:42Z|00109|binding|INFO|Claiming lport 4e91cc02-2569-488e-b88d-1a635ca9e1fa for this chassis.
Nov 25 14:08:42 np0005535656 ovn_controller[95460]: 2025-11-25T19:08:42Z|00110|binding|INFO|4e91cc02-2569-488e-b88d-1a635ca9e1fa: Claiming fa:16:3e:4f:2c:ab 10.100.0.12
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.739 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:42 np0005535656 NetworkManager[55548]: <info>  [1764097722.7404] manager: (tap4e91cc02-25): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Nov 25 14:08:42 np0005535656 ovn_controller[95460]: 2025-11-25T19:08:42Z|00111|binding|INFO|Setting lport 4e91cc02-2569-488e-b88d-1a635ca9e1fa ovn-installed in OVS
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.751 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:2c:ab 10.100.0.12'], port_security=['fa:16:3e:4f:2c:ab 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '708a51ee-14d7-4511-ab36-5798d1c8de28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=4e91cc02-2569-488e-b88d-1a635ca9e1fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.751 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:42 np0005535656 ovn_controller[95460]: 2025-11-25T19:08:42Z|00112|binding|INFO|Setting lport 4e91cc02-2569-488e-b88d-1a635ca9e1fa up in Southbound
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.751 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.752 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 4e91cc02-2569-488e-b88d-1a635ca9e1fa in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 bound to our chassis#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.753 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.753 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.757 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:42 np0005535656 systemd-udevd[214297]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.764 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[40c5ba40-9125-4ec9-9606-3446f300a28e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.764 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e881e87-b1 in ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.767 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e881e87-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.767 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[44b062a6-3e93-4eb4-bc39-a03365fa5853]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.768 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[98c6cde5-f878-4519-9eb2-03ba122aac47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 systemd-machined[153481]: New machine qemu-10-instance-00000010.
Nov 25 14:08:42 np0005535656 NetworkManager[55548]: <info>  [1764097722.7764] device (tap4e91cc02-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:08:42 np0005535656 NetworkManager[55548]: <info>  [1764097722.7775] device (tap4e91cc02-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.778 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[450f68a2-0bfb-4736-8690-53642a75eb9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 systemd[1]: Started Virtual Machine qemu-10-instance-00000010.
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.800 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc07154-c9b8-48f5-a1c5-5e21147b89f5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.823 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[bda130f0-7b07-4263-9cda-2e8526e40d1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.828 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f855f420-86d6-459c-a21b-8768b4e831cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 NetworkManager[55548]: <info>  [1764097722.8298] manager: (tap8e881e87-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.857 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[96df263a-adfb-4abf-92ac-7df2ac81e89d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.860 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[2c867861-e469-49f4-955a-521f7d1211a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 NetworkManager[55548]: <info>  [1764097722.8812] device (tap8e881e87-b0): carrier: link connected
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.886 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[3369a37a-b014-4248-9356-1b574ebcf0f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.901 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[bd94bb85-bc46-4690-a79b-e121f533a581]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468440, 'reachable_time': 34504, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214330, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.907 187223 DEBUG nova.compute.manager [req-23a7d506-4171-4d4b-870f-937f8fe55931 req-a2f9e146-0ad8-4daf-bc2c-280673460d7f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Received event network-vif-plugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.907 187223 DEBUG oslo_concurrency.lockutils [req-23a7d506-4171-4d4b-870f-937f8fe55931 req-a2f9e146-0ad8-4daf-bc2c-280673460d7f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.908 187223 DEBUG oslo_concurrency.lockutils [req-23a7d506-4171-4d4b-870f-937f8fe55931 req-a2f9e146-0ad8-4daf-bc2c-280673460d7f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.908 187223 DEBUG oslo_concurrency.lockutils [req-23a7d506-4171-4d4b-870f-937f8fe55931 req-a2f9e146-0ad8-4daf-bc2c-280673460d7f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:42 np0005535656 nova_compute[187219]: 2025-11-25 19:08:42.908 187223 DEBUG nova.compute.manager [req-23a7d506-4171-4d4b-870f-937f8fe55931 req-a2f9e146-0ad8-4daf-bc2c-280673460d7f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Processing event network-vif-plugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.919 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3541cd-3df4-4e6a-b7cc-5475dbc6ba83]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:6d5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468440, 'tstamp': 468440}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214331, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.936 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[7c2e3028-f86a-40fb-b135-e7b07834d510]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468440, 'reachable_time': 34504, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214332, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:42 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:42.968 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[404efb14-bdc7-4d19-8374-1661d76d491b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:43.036 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ac2183-7c7b-417d-971c-977ebdcb8e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:43.038 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:43.039 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:43.040 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e881e87-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.042 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:43 np0005535656 kernel: tap8e881e87-b0: entered promiscuous mode
Nov 25 14:08:43 np0005535656 NetworkManager[55548]: <info>  [1764097723.0445] manager: (tap8e881e87-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:43.045 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e881e87-b0, col_values=(('external_ids', {'iface-id': 'f01fca37-0f9e-4574-bd34-7de06647d521'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.047 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:43 np0005535656 ovn_controller[95460]: 2025-11-25T19:08:43Z|00113|binding|INFO|Releasing lport f01fca37-0f9e-4574-bd34-7de06647d521 from this chassis (sb_readonly=0)
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.056 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:43.058 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:43.059 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[15094fcd-711b-43dd-ae68-564f9390aaba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:43.060 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 14:08:43 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:43.061 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'env', 'PROCESS_TAG=haproxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e881e87-b103-4ad8-8de5-f8f4f0a10891.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.235 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097723.235447, 708a51ee-14d7-4511-ab36-5798d1c8de28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.237 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] VM Started (Lifecycle Event)#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.238 187223 DEBUG nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.243 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.246 187223 INFO nova.virt.libvirt.driver [-] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Instance spawned successfully.#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.246 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.282 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.286 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.289 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.289 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.290 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.290 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.290 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.291 187223 DEBUG nova.virt.libvirt.driver [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.330 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.331 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097723.2363372, 708a51ee-14d7-4511-ab36-5798d1c8de28 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.331 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.350 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.353 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097723.2416763, 708a51ee-14d7-4511-ab36-5798d1c8de28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.353 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.357 187223 INFO nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Took 4.55 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.357 187223 DEBUG nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.368 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.371 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.392 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:08:43 np0005535656 podman[214372]: 2025-11-25 19:08:43.408654775 +0000 UTC m=+0.047514807 container create 5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.414 187223 INFO nova.compute.manager [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Took 5.02 seconds to build instance.#033[00m
Nov 25 14:08:43 np0005535656 nova_compute[187219]: 2025-11-25 19:08:43.430 187223 DEBUG oslo_concurrency.lockutils [None req-e63dfcbb-c744-4f0b-8007-73fe05052d5b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:43 np0005535656 systemd[1]: Started libpod-conmon-5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa.scope.
Nov 25 14:08:43 np0005535656 podman[214372]: 2025-11-25 19:08:43.383274253 +0000 UTC m=+0.022134305 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:08:43 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:08:43 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bef66b16b8e480b12b279ce626f2e60d249aee0cd945d73d574c520dad8030f1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:08:43 np0005535656 podman[214372]: 2025-11-25 19:08:43.524631518 +0000 UTC m=+0.163491590 container init 5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 14:08:43 np0005535656 podman[214372]: 2025-11-25 19:08:43.534664918 +0000 UTC m=+0.173524960 container start 5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 25 14:08:43 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[214388]: [NOTICE]   (214392) : New worker (214394) forked
Nov 25 14:08:43 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[214388]: [NOTICE]   (214392) : Loading success.
Nov 25 14:08:44 np0005535656 nova_compute[187219]: 2025-11-25 19:08:44.999 187223 DEBUG nova.network.neutron [req-f374c83e-1225-4ada-974e-5adfd95eb9e6 req-d266820a-a047-43d8-98a4-e48869191d10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Updated VIF entry in instance network info cache for port 4e91cc02-2569-488e-b88d-1a635ca9e1fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:08:45 np0005535656 nova_compute[187219]: 2025-11-25 19:08:45.000 187223 DEBUG nova.network.neutron [req-f374c83e-1225-4ada-974e-5adfd95eb9e6 req-d266820a-a047-43d8-98a4-e48869191d10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Updating instance_info_cache with network_info: [{"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:08:45 np0005535656 nova_compute[187219]: 2025-11-25 19:08:45.022 187223 DEBUG oslo_concurrency.lockutils [req-f374c83e-1225-4ada-974e-5adfd95eb9e6 req-d266820a-a047-43d8-98a4-e48869191d10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-708a51ee-14d7-4511-ab36-5798d1c8de28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:08:46 np0005535656 nova_compute[187219]: 2025-11-25 19:08:46.096 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:46 np0005535656 nova_compute[187219]: 2025-11-25 19:08:46.630 187223 DEBUG nova.compute.manager [req-cd8dcdeb-d2bf-4b81-ad0a-db4ea3b64014 req-31fc2153-f806-4fb0-b911-46ef898079a9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Received event network-vif-plugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:08:46 np0005535656 nova_compute[187219]: 2025-11-25 19:08:46.630 187223 DEBUG oslo_concurrency.lockutils [req-cd8dcdeb-d2bf-4b81-ad0a-db4ea3b64014 req-31fc2153-f806-4fb0-b911-46ef898079a9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:46 np0005535656 nova_compute[187219]: 2025-11-25 19:08:46.631 187223 DEBUG oslo_concurrency.lockutils [req-cd8dcdeb-d2bf-4b81-ad0a-db4ea3b64014 req-31fc2153-f806-4fb0-b911-46ef898079a9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:46 np0005535656 nova_compute[187219]: 2025-11-25 19:08:46.631 187223 DEBUG oslo_concurrency.lockutils [req-cd8dcdeb-d2bf-4b81-ad0a-db4ea3b64014 req-31fc2153-f806-4fb0-b911-46ef898079a9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:08:46 np0005535656 nova_compute[187219]: 2025-11-25 19:08:46.631 187223 DEBUG nova.compute.manager [req-cd8dcdeb-d2bf-4b81-ad0a-db4ea3b64014 req-31fc2153-f806-4fb0-b911-46ef898079a9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] No waiting events found dispatching network-vif-plugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:08:46 np0005535656 nova_compute[187219]: 2025-11-25 19:08:46.632 187223 WARNING nova.compute.manager [req-cd8dcdeb-d2bf-4b81-ad0a-db4ea3b64014 req-31fc2153-f806-4fb0-b911-46ef898079a9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Received unexpected event network-vif-plugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa for instance with vm_state active and task_state None.#033[00m
Nov 25 14:08:46 np0005535656 podman[214403]: 2025-11-25 19:08:46.952512129 +0000 UTC m=+0.079830064 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:08:46 np0005535656 podman[214404]: 2025-11-25 19:08:46.958682736 +0000 UTC m=+0.079741652 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:08:47 np0005535656 nova_compute[187219]: 2025-11-25 19:08:47.172 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:08:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:08:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:08:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:08:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:08:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:08:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:08:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:08:51 np0005535656 nova_compute[187219]: 2025-11-25 19:08:51.097 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:52 np0005535656 nova_compute[187219]: 2025-11-25 19:08:52.175 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:52 np0005535656 podman[214448]: 2025-11-25 19:08:52.934507714 +0000 UTC m=+0.053292502 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=)
Nov 25 14:08:56 np0005535656 nova_compute[187219]: 2025-11-25 19:08:56.100 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:56 np0005535656 ovn_controller[95460]: 2025-11-25T19:08:56Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4f:2c:ab 10.100.0.12
Nov 25 14:08:56 np0005535656 ovn_controller[95460]: 2025-11-25T19:08:56Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4f:2c:ab 10.100.0.12
Nov 25 14:08:56 np0005535656 podman[214486]: 2025-11-25 19:08:56.942176591 +0000 UTC m=+0.057469744 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd)
Nov 25 14:08:57 np0005535656 nova_compute[187219]: 2025-11-25 19:08:57.179 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:08:58 np0005535656 nova_compute[187219]: 2025-11-25 19:08:58.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:08:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:59.085 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:08:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:59.086 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:08:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:08:59.087 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:01 np0005535656 nova_compute[187219]: 2025-11-25 19:09:01.101 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:01 np0005535656 nova_compute[187219]: 2025-11-25 19:09:01.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:01 np0005535656 nova_compute[187219]: 2025-11-25 19:09:01.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:09:01 np0005535656 nova_compute[187219]: 2025-11-25 19:09:01.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:09:01 np0005535656 nova_compute[187219]: 2025-11-25 19:09:01.907 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-708a51ee-14d7-4511-ab36-5798d1c8de28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:09:01 np0005535656 nova_compute[187219]: 2025-11-25 19:09:01.907 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-708a51ee-14d7-4511-ab36-5798d1c8de28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:09:01 np0005535656 nova_compute[187219]: 2025-11-25 19:09:01.907 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 14:09:01 np0005535656 nova_compute[187219]: 2025-11-25 19:09:01.908 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 708a51ee-14d7-4511-ab36-5798d1c8de28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:09:02 np0005535656 nova_compute[187219]: 2025-11-25 19:09:02.182 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:02 np0005535656 nova_compute[187219]: 2025-11-25 19:09:02.911 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Updating instance_info_cache with network_info: [{"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:09:02 np0005535656 nova_compute[187219]: 2025-11-25 19:09:02.930 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-708a51ee-14d7-4511-ab36-5798d1c8de28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:09:02 np0005535656 nova_compute[187219]: 2025-11-25 19:09:02.931 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 14:09:02 np0005535656 nova_compute[187219]: 2025-11-25 19:09:02.932 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:03 np0005535656 nova_compute[187219]: 2025-11-25 19:09:03.928 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:04 np0005535656 nova_compute[187219]: 2025-11-25 19:09:04.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:04 np0005535656 nova_compute[187219]: 2025-11-25 19:09:04.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:09:05 np0005535656 podman[197580]: time="2025-11-25T19:09:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:09:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:09:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:09:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:09:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Nov 25 14:09:06 np0005535656 nova_compute[187219]: 2025-11-25 19:09:06.103 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:06 np0005535656 nova_compute[187219]: 2025-11-25 19:09:06.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:06 np0005535656 nova_compute[187219]: 2025-11-25 19:09:06.701 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:07 np0005535656 nova_compute[187219]: 2025-11-25 19:09:07.184 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:08 np0005535656 nova_compute[187219]: 2025-11-25 19:09:08.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:08 np0005535656 podman[214507]: 2025-11-25 19:09:08.922185809 +0000 UTC m=+0.050542618 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:09:09 np0005535656 nova_compute[187219]: 2025-11-25 19:09:09.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:09 np0005535656 nova_compute[187219]: 2025-11-25 19:09:09.692 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:09 np0005535656 nova_compute[187219]: 2025-11-25 19:09:09.692 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:09 np0005535656 nova_compute[187219]: 2025-11-25 19:09:09.692 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:09 np0005535656 nova_compute[187219]: 2025-11-25 19:09:09.692 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:09:09 np0005535656 nova_compute[187219]: 2025-11-25 19:09:09.765 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:09:09 np0005535656 nova_compute[187219]: 2025-11-25 19:09:09.850 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:09:09 np0005535656 nova_compute[187219]: 2025-11-25 19:09:09.851 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:09:09 np0005535656 nova_compute[187219]: 2025-11-25 19:09:09.907 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.070 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.071 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5716MB free_disk=73.13518142700195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.071 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.072 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.165 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 708a51ee-14d7-4511-ab36-5798d1c8de28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.165 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.166 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.209 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.221 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.242 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:09:10 np0005535656 nova_compute[187219]: 2025-11-25 19:09:10.243 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:11 np0005535656 nova_compute[187219]: 2025-11-25 19:09:11.105 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:11 np0005535656 nova_compute[187219]: 2025-11-25 19:09:11.243 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:12 np0005535656 nova_compute[187219]: 2025-11-25 19:09:12.186 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:12 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:12Z|00114|memory_trim|INFO|Detected inactivity (last active 30020 ms ago): trimming memory
Nov 25 14:09:14 np0005535656 nova_compute[187219]: 2025-11-25 19:09:14.319 187223 DEBUG nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Creating tmpfile /var/lib/nova/instances/tmpc0t_jnyn to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 25 14:09:14 np0005535656 nova_compute[187219]: 2025-11-25 19:09:14.319 187223 DEBUG nova.compute.manager [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc0t_jnyn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 25 14:09:15 np0005535656 nova_compute[187219]: 2025-11-25 19:09:15.038 187223 DEBUG nova.compute.manager [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc0t_jnyn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dbbbbc94-b53b-46db-b612-cc535b34fecc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 25 14:09:15 np0005535656 nova_compute[187219]: 2025-11-25 19:09:15.075 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-dbbbbc94-b53b-46db-b612-cc535b34fecc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:09:15 np0005535656 nova_compute[187219]: 2025-11-25 19:09:15.076 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-dbbbbc94-b53b-46db-b612-cc535b34fecc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:09:15 np0005535656 nova_compute[187219]: 2025-11-25 19:09:15.076 187223 DEBUG nova.network.neutron [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:09:16 np0005535656 nova_compute[187219]: 2025-11-25 19:09:16.107 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.188 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.268 187223 DEBUG nova.network.neutron [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Updating instance_info_cache with network_info: [{"id": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "address": "fa:16:3e:a0:8a:59", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3200a-d4", "ovs_interfaceid": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.296 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-dbbbbc94-b53b-46db-b612-cc535b34fecc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.299 187223 DEBUG nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc0t_jnyn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dbbbbc94-b53b-46db-b612-cc535b34fecc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.300 187223 DEBUG nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Creating instance directory: /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.301 187223 DEBUG nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Creating disk.info with the contents: {'/var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk': 'qcow2', '/var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.301 187223 DEBUG nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.302 187223 DEBUG nova.objects.instance [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid dbbbbc94-b53b-46db-b612-cc535b34fecc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.330 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.383 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.384 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.385 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.394 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.470 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.471 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.864 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk 1073741824" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.866 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.867 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.946 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.947 187223 DEBUG nova.virt.disk.api [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Checking if we can resize image /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:09:17 np0005535656 nova_compute[187219]: 2025-11-25 19:09:17.947 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:09:17 np0005535656 podman[214549]: 2025-11-25 19:09:17.982572381 +0000 UTC m=+0.080600306 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.002 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.003 187223 DEBUG nova.virt.disk.api [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Cannot resize image /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.004 187223 DEBUG nova.objects.instance [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid dbbbbc94-b53b-46db-b612-cc535b34fecc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:09:18 np0005535656 podman[214547]: 2025-11-25 19:09:18.01237211 +0000 UTC m=+0.119155771 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.022 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.043 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk.config 485376" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.045 187223 DEBUG nova.virt.libvirt.volume.remotefs [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk.config to /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.045 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk.config /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.522 187223 DEBUG oslo_concurrency.processutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc/disk.config /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.523 187223 DEBUG nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.524 187223 DEBUG nova.virt.libvirt.vif [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:08:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1456001434',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1456001434',id=15,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:08:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-wkid6hcp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:08:29Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=dbbbbc94-b53b-46db-b612-cc535b34fecc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "address": "fa:16:3e:a0:8a:59", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7fa3200a-d4", "ovs_interfaceid": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.524 187223 DEBUG nova.network.os_vif_util [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "address": "fa:16:3e:a0:8a:59", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7fa3200a-d4", "ovs_interfaceid": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.525 187223 DEBUG nova.network.os_vif_util [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:8a:59,bridge_name='br-int',has_traffic_filtering=True,id=7fa3200a-d4a1-49a5-99cf-2d0b7c75720e,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3200a-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.526 187223 DEBUG os_vif [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:8a:59,bridge_name='br-int',has_traffic_filtering=True,id=7fa3200a-d4a1-49a5-99cf-2d0b7c75720e,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3200a-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.527 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.527 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.528 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.530 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.531 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7fa3200a-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.531 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7fa3200a-d4, col_values=(('external_ids', {'iface-id': '7fa3200a-d4a1-49a5-99cf-2d0b7c75720e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:8a:59', 'vm-uuid': 'dbbbbc94-b53b-46db-b612-cc535b34fecc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.533 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:18 np0005535656 NetworkManager[55548]: <info>  [1764097758.5348] manager: (tap7fa3200a-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.537 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.540 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.541 187223 INFO os_vif [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:8a:59,bridge_name='br-int',has_traffic_filtering=True,id=7fa3200a-d4a1-49a5-99cf-2d0b7c75720e,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3200a-d4')#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.542 187223 DEBUG nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 25 14:09:18 np0005535656 nova_compute[187219]: 2025-11-25 19:09:18.542 187223 DEBUG nova.compute.manager [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc0t_jnyn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dbbbbc94-b53b-46db-b612-cc535b34fecc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 25 14:09:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:09:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:09:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:09:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:09:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:09:19 np0005535656 nova_compute[187219]: 2025-11-25 19:09:19.975 187223 DEBUG nova.network.neutron [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Port 7fa3200a-d4a1-49a5-99cf-2d0b7c75720e updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 25 14:09:19 np0005535656 nova_compute[187219]: 2025-11-25 19:09:19.977 187223 DEBUG nova.compute.manager [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc0t_jnyn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dbbbbc94-b53b-46db-b612-cc535b34fecc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 25 14:09:20 np0005535656 systemd[1]: Starting libvirt proxy daemon...
Nov 25 14:09:20 np0005535656 systemd[1]: Started libvirt proxy daemon.
Nov 25 14:09:20 np0005535656 kernel: tap7fa3200a-d4: entered promiscuous mode
Nov 25 14:09:20 np0005535656 NetworkManager[55548]: <info>  [1764097760.2941] manager: (tap7fa3200a-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Nov 25 14:09:20 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:20Z|00115|binding|INFO|Claiming lport 7fa3200a-d4a1-49a5-99cf-2d0b7c75720e for this additional chassis.
Nov 25 14:09:20 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:20Z|00116|binding|INFO|7fa3200a-d4a1-49a5-99cf-2d0b7c75720e: Claiming fa:16:3e:a0:8a:59 10.100.0.5
Nov 25 14:09:20 np0005535656 nova_compute[187219]: 2025-11-25 19:09:20.293 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:20 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:20Z|00117|binding|INFO|Setting lport 7fa3200a-d4a1-49a5-99cf-2d0b7c75720e ovn-installed in OVS
Nov 25 14:09:20 np0005535656 nova_compute[187219]: 2025-11-25 19:09:20.307 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:20 np0005535656 nova_compute[187219]: 2025-11-25 19:09:20.308 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:20 np0005535656 systemd-udevd[214637]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:09:20 np0005535656 systemd-machined[153481]: New machine qemu-11-instance-0000000f.
Nov 25 14:09:20 np0005535656 NetworkManager[55548]: <info>  [1764097760.3352] device (tap7fa3200a-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:09:20 np0005535656 NetworkManager[55548]: <info>  [1764097760.3361] device (tap7fa3200a-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:09:20 np0005535656 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Nov 25 14:09:21 np0005535656 nova_compute[187219]: 2025-11-25 19:09:21.109 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:22 np0005535656 nova_compute[187219]: 2025-11-25 19:09:22.320 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097762.320005, dbbbbc94-b53b-46db-b612-cc535b34fecc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:09:22 np0005535656 nova_compute[187219]: 2025-11-25 19:09:22.321 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] VM Started (Lifecycle Event)#033[00m
Nov 25 14:09:22 np0005535656 nova_compute[187219]: 2025-11-25 19:09:22.344 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:09:23 np0005535656 nova_compute[187219]: 2025-11-25 19:09:23.336 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097763.3356748, dbbbbc94-b53b-46db-b612-cc535b34fecc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:09:23 np0005535656 nova_compute[187219]: 2025-11-25 19:09:23.336 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:09:23 np0005535656 nova_compute[187219]: 2025-11-25 19:09:23.382 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:09:23 np0005535656 nova_compute[187219]: 2025-11-25 19:09:23.385 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:09:23 np0005535656 nova_compute[187219]: 2025-11-25 19:09:23.422 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 25 14:09:23 np0005535656 nova_compute[187219]: 2025-11-25 19:09:23.533 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:24 np0005535656 podman[214664]: 2025-11-25 19:09:24.016297342 +0000 UTC m=+0.120615169 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9)
Nov 25 14:09:24 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:24Z|00118|binding|INFO|Claiming lport 7fa3200a-d4a1-49a5-99cf-2d0b7c75720e for this chassis.
Nov 25 14:09:24 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:24Z|00119|binding|INFO|7fa3200a-d4a1-49a5-99cf-2d0b7c75720e: Claiming fa:16:3e:a0:8a:59 10.100.0.5
Nov 25 14:09:24 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:24Z|00120|binding|INFO|Setting lport 7fa3200a-d4a1-49a5-99cf-2d0b7c75720e up in Southbound
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.382 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:8a:59 10.100.0.5'], port_security=['fa:16:3e:a0:8a:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbbbbc94-b53b-46db-b612-cc535b34fecc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '11', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=7fa3200a-d4a1-49a5-99cf-2d0b7c75720e) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.383 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 7fa3200a-d4a1-49a5-99cf-2d0b7c75720e in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 bound to our chassis#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.384 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.398 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[8bca1cd6-9e5b-4e6a-9eb2-1ab578fb9db0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.425 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[bb254806-c603-4eb1-94e5-fbe84be125c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.428 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[56d2dfca-f411-4277-a68d-58f34c4b5328]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.451 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[7f4f57af-9356-4b17-87c6-18af5d7ebf49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.466 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[51b630d8-f169-44ef-a765-c27de2e22391]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468440, 'reachable_time': 31699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214691, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.481 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1404c4-ac88-45e0-85ed-d390abf095e3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8e881e87-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468452, 'tstamp': 468452}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214692, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8e881e87-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468455, 'tstamp': 468455}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214692, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.483 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:24 np0005535656 nova_compute[187219]: 2025-11-25 19:09:24.484 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:24 np0005535656 nova_compute[187219]: 2025-11-25 19:09:24.486 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.486 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e881e87-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.487 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.487 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e881e87-b0, col_values=(('external_ids', {'iface-id': 'f01fca37-0f9e-4574-bd34-7de06647d521'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:24.487 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:09:24 np0005535656 nova_compute[187219]: 2025-11-25 19:09:24.532 187223 INFO nova.compute.manager [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Post operation of migration started#033[00m
Nov 25 14:09:25 np0005535656 nova_compute[187219]: 2025-11-25 19:09:25.139 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-dbbbbc94-b53b-46db-b612-cc535b34fecc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:09:25 np0005535656 nova_compute[187219]: 2025-11-25 19:09:25.140 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-dbbbbc94-b53b-46db-b612-cc535b34fecc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:09:25 np0005535656 nova_compute[187219]: 2025-11-25 19:09:25.140 187223 DEBUG nova.network.neutron [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:09:26 np0005535656 nova_compute[187219]: 2025-11-25 19:09:26.110 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:26 np0005535656 nova_compute[187219]: 2025-11-25 19:09:26.508 187223 DEBUG nova.network.neutron [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Updating instance_info_cache with network_info: [{"id": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "address": "fa:16:3e:a0:8a:59", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3200a-d4", "ovs_interfaceid": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:09:26 np0005535656 nova_compute[187219]: 2025-11-25 19:09:26.526 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-dbbbbc94-b53b-46db-b612-cc535b34fecc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:09:26 np0005535656 nova_compute[187219]: 2025-11-25 19:09:26.545 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:26 np0005535656 nova_compute[187219]: 2025-11-25 19:09:26.546 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:26 np0005535656 nova_compute[187219]: 2025-11-25 19:09:26.546 187223 DEBUG oslo_concurrency.lockutils [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:26 np0005535656 nova_compute[187219]: 2025-11-25 19:09:26.553 187223 INFO nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 25 14:09:26 np0005535656 virtqemud[186765]: Domain id=11 name='instance-0000000f' uuid=dbbbbc94-b53b-46db-b612-cc535b34fecc is tainted: custom-monitor
Nov 25 14:09:27 np0005535656 nova_compute[187219]: 2025-11-25 19:09:27.562 187223 INFO nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 25 14:09:27 np0005535656 podman[214693]: 2025-11-25 19:09:27.972671033 +0000 UTC m=+0.087001268 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 25 14:09:28 np0005535656 nova_compute[187219]: 2025-11-25 19:09:28.536 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:28 np0005535656 nova_compute[187219]: 2025-11-25 19:09:28.567 187223 INFO nova.virt.libvirt.driver [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 25 14:09:28 np0005535656 nova_compute[187219]: 2025-11-25 19:09:28.571 187223 DEBUG nova.compute.manager [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:09:28 np0005535656 nova_compute[187219]: 2025-11-25 19:09:28.587 187223 DEBUG nova.objects.instance [None req-8522cb63-feb2-449c-8608-65007027df20 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 25 14:09:31 np0005535656 nova_compute[187219]: 2025-11-25 19:09:31.111 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:33 np0005535656 nova_compute[187219]: 2025-11-25 19:09:33.540 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:34 np0005535656 nova_compute[187219]: 2025-11-25 19:09:34.920 187223 DEBUG oslo_concurrency.lockutils [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "708a51ee-14d7-4511-ab36-5798d1c8de28" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:34 np0005535656 nova_compute[187219]: 2025-11-25 19:09:34.920 187223 DEBUG oslo_concurrency.lockutils [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:34 np0005535656 nova_compute[187219]: 2025-11-25 19:09:34.921 187223 DEBUG oslo_concurrency.lockutils [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:34 np0005535656 nova_compute[187219]: 2025-11-25 19:09:34.921 187223 DEBUG oslo_concurrency.lockutils [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:34 np0005535656 nova_compute[187219]: 2025-11-25 19:09:34.921 187223 DEBUG oslo_concurrency.lockutils [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:34 np0005535656 nova_compute[187219]: 2025-11-25 19:09:34.922 187223 INFO nova.compute.manager [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Terminating instance#033[00m
Nov 25 14:09:34 np0005535656 nova_compute[187219]: 2025-11-25 19:09:34.923 187223 DEBUG nova.compute.manager [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 25 14:09:34 np0005535656 kernel: tap4e91cc02-25 (unregistering): left promiscuous mode
Nov 25 14:09:34 np0005535656 NetworkManager[55548]: <info>  [1764097774.9579] device (tap4e91cc02-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:09:34 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:34Z|00121|binding|INFO|Releasing lport 4e91cc02-2569-488e-b88d-1a635ca9e1fa from this chassis (sb_readonly=0)
Nov 25 14:09:34 np0005535656 nova_compute[187219]: 2025-11-25 19:09:34.971 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:34 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:34Z|00122|binding|INFO|Setting lport 4e91cc02-2569-488e-b88d-1a635ca9e1fa down in Southbound
Nov 25 14:09:34 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:34Z|00123|binding|INFO|Removing iface tap4e91cc02-25 ovn-installed in OVS
Nov 25 14:09:34 np0005535656 nova_compute[187219]: 2025-11-25 19:09:34.976 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:34 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:34.983 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:2c:ab 10.100.0.12'], port_security=['fa:16:3e:4f:2c:ab 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '708a51ee-14d7-4511-ab36-5798d1c8de28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=4e91cc02-2569-488e-b88d-1a635ca9e1fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:09:34 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:34.985 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 4e91cc02-2569-488e-b88d-1a635ca9e1fa in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:09:34 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:34.987 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.000 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:35 np0005535656 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 25 14:09:35 np0005535656 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Consumed 14.109s CPU time.
Nov 25 14:09:35 np0005535656 systemd-machined[153481]: Machine qemu-10-instance-00000010 terminated.
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.018 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1323478d-dbc7-4780-bf74-b159a9065b8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.062 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[cb592395-4536-41e8-9538-30e2b84a1a76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.066 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[143ba504-a045-4d2f-9b00-f966c7c58625]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.110 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[0f4ed3d1-32a7-4b62-9441-651481930f7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.142 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[46b935f6-2c54-4cdb-94dd-691f491a9d21]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468440, 'reachable_time': 31699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214723, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.173 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[0c773d89-8879-4342-b7ac-a9e67169ccb7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8e881e87-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468452, 'tstamp': 468452}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214728, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8e881e87-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468455, 'tstamp': 468455}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214728, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.175 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.178 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.185 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.186 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e881e87-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.186 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.186 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e881e87-b0, col_values=(('external_ids', {'iface-id': 'f01fca37-0f9e-4574-bd34-7de06647d521'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:35.187 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.212 187223 INFO nova.virt.libvirt.driver [-] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Instance destroyed successfully.#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.213 187223 DEBUG nova.objects.instance [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'resources' on Instance uuid 708a51ee-14d7-4511-ab36-5798d1c8de28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.231 187223 DEBUG nova.virt.libvirt.vif [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:08:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-343208568',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-343208568',id=16,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:08:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-44wciidh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_na
me='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:08:43Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=708a51ee-14d7-4511-ab36-5798d1c8de28,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.232 187223 DEBUG nova.network.os_vif_util [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "address": "fa:16:3e:4f:2c:ab", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e91cc02-25", "ovs_interfaceid": "4e91cc02-2569-488e-b88d-1a635ca9e1fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.232 187223 DEBUG nova.network.os_vif_util [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:2c:ab,bridge_name='br-int',has_traffic_filtering=True,id=4e91cc02-2569-488e-b88d-1a635ca9e1fa,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e91cc02-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.233 187223 DEBUG os_vif [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:2c:ab,bridge_name='br-int',has_traffic_filtering=True,id=4e91cc02-2569-488e-b88d-1a635ca9e1fa,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e91cc02-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.234 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.234 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4e91cc02-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.236 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.237 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.240 187223 INFO os_vif [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:2c:ab,bridge_name='br-int',has_traffic_filtering=True,id=4e91cc02-2569-488e-b88d-1a635ca9e1fa,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e91cc02-25')#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.241 187223 INFO nova.virt.libvirt.driver [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Deleting instance files /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28_del#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.241 187223 INFO nova.virt.libvirt.driver [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Deletion of /var/lib/nova/instances/708a51ee-14d7-4511-ab36-5798d1c8de28_del complete#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.336 187223 INFO nova.compute.manager [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.337 187223 DEBUG oslo.service.loopingcall [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.337 187223 DEBUG nova.compute.manager [-] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 25 14:09:35 np0005535656 nova_compute[187219]: 2025-11-25 19:09:35.337 187223 DEBUG nova.network.neutron [-] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 25 14:09:35 np0005535656 podman[197580]: time="2025-11-25T19:09:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:09:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:09:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:09:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:09:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.114 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.692 187223 DEBUG nova.compute.manager [req-9cfa4684-d472-4c21-aac7-862c669d8dc3 req-7a998b7f-fb07-4d69-86fa-d447da74ff10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Received event network-vif-unplugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.692 187223 DEBUG oslo_concurrency.lockutils [req-9cfa4684-d472-4c21-aac7-862c669d8dc3 req-7a998b7f-fb07-4d69-86fa-d447da74ff10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.692 187223 DEBUG oslo_concurrency.lockutils [req-9cfa4684-d472-4c21-aac7-862c669d8dc3 req-7a998b7f-fb07-4d69-86fa-d447da74ff10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.692 187223 DEBUG oslo_concurrency.lockutils [req-9cfa4684-d472-4c21-aac7-862c669d8dc3 req-7a998b7f-fb07-4d69-86fa-d447da74ff10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.693 187223 DEBUG nova.compute.manager [req-9cfa4684-d472-4c21-aac7-862c669d8dc3 req-7a998b7f-fb07-4d69-86fa-d447da74ff10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] No waiting events found dispatching network-vif-unplugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.693 187223 DEBUG nova.compute.manager [req-9cfa4684-d472-4c21-aac7-862c669d8dc3 req-7a998b7f-fb07-4d69-86fa-d447da74ff10 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Received event network-vif-unplugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:09:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:36.804 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.804 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:36.805 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.816 187223 DEBUG nova.network.neutron [-] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.831 187223 INFO nova.compute.manager [-] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Took 1.49 seconds to deallocate network for instance.#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.868 187223 DEBUG oslo_concurrency.lockutils [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.869 187223 DEBUG oslo_concurrency.lockutils [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.877 187223 DEBUG nova.compute.manager [req-0c830990-a8be-473b-bfaa-cb18d8f9565f req-3e894bc1-9282-4554-937b-7109a407994d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Received event network-vif-deleted-4e91cc02-2569-488e-b88d-1a635ca9e1fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.929 187223 DEBUG nova.compute.provider_tree [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:09:36 np0005535656 nova_compute[187219]: 2025-11-25 19:09:36.949 187223 DEBUG nova.scheduler.client.report [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.072 187223 DEBUG oslo_concurrency.lockutils [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.097 187223 INFO nova.scheduler.client.report [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Deleted allocations for instance 708a51ee-14d7-4511-ab36-5798d1c8de28#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.163 187223 DEBUG oslo_concurrency.lockutils [None req-094c09f4-f74b-45ed-8020-586113e27dd3 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.941 187223 DEBUG oslo_concurrency.lockutils [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "dbbbbc94-b53b-46db-b612-cc535b34fecc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.942 187223 DEBUG oslo_concurrency.lockutils [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "dbbbbc94-b53b-46db-b612-cc535b34fecc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.942 187223 DEBUG oslo_concurrency.lockutils [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "dbbbbc94-b53b-46db-b612-cc535b34fecc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.943 187223 DEBUG oslo_concurrency.lockutils [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "dbbbbc94-b53b-46db-b612-cc535b34fecc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.943 187223 DEBUG oslo_concurrency.lockutils [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "dbbbbc94-b53b-46db-b612-cc535b34fecc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.945 187223 INFO nova.compute.manager [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Terminating instance#033[00m
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.947 187223 DEBUG nova.compute.manager [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 25 14:09:37 np0005535656 kernel: tap7fa3200a-d4 (unregistering): left promiscuous mode
Nov 25 14:09:37 np0005535656 NetworkManager[55548]: <info>  [1764097777.9762] device (tap7fa3200a-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.987 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:37 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:37Z|00124|binding|INFO|Releasing lport 7fa3200a-d4a1-49a5-99cf-2d0b7c75720e from this chassis (sb_readonly=0)
Nov 25 14:09:37 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:37Z|00125|binding|INFO|Setting lport 7fa3200a-d4a1-49a5-99cf-2d0b7c75720e down in Southbound
Nov 25 14:09:37 np0005535656 ovn_controller[95460]: 2025-11-25T19:09:37Z|00126|binding|INFO|Removing iface tap7fa3200a-d4 ovn-installed in OVS
Nov 25 14:09:37 np0005535656 nova_compute[187219]: 2025-11-25 19:09:37.989 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:37.995 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:8a:59 10.100.0.5'], port_security=['fa:16:3e:a0:8a:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbbbbc94-b53b-46db-b612-cc535b34fecc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '11', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=7fa3200a-d4a1-49a5-99cf-2d0b7c75720e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:09:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:37.997 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 7fa3200a-d4a1-49a5-99cf-2d0b7c75720e in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.000 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.001 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[fae50f10-1604-4855-9497-8bf45687f352]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.002 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace which is not needed anymore#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.008 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:38 np0005535656 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Nov 25 14:09:38 np0005535656 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 3.171s CPU time.
Nov 25 14:09:38 np0005535656 systemd-machined[153481]: Machine qemu-11-instance-0000000f terminated.
Nov 25 14:09:38 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[214388]: [NOTICE]   (214392) : haproxy version is 2.8.14-c23fe91
Nov 25 14:09:38 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[214388]: [NOTICE]   (214392) : path to executable is /usr/sbin/haproxy
Nov 25 14:09:38 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[214388]: [ALERT]    (214392) : Current worker (214394) exited with code 143 (Terminated)
Nov 25 14:09:38 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[214388]: [WARNING]  (214392) : All workers exited. Exiting... (0)
Nov 25 14:09:38 np0005535656 systemd[1]: libpod-5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa.scope: Deactivated successfully.
Nov 25 14:09:38 np0005535656 podman[214768]: 2025-11-25 19:09:38.163934804 +0000 UTC m=+0.049163911 container died 5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 25 14:09:38 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa-userdata-shm.mount: Deactivated successfully.
Nov 25 14:09:38 np0005535656 systemd[1]: var-lib-containers-storage-overlay-bef66b16b8e480b12b279ce626f2e60d249aee0cd945d73d574c520dad8030f1-merged.mount: Deactivated successfully.
Nov 25 14:09:38 np0005535656 podman[214768]: 2025-11-25 19:09:38.206086856 +0000 UTC m=+0.091315953 container cleanup 5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:09:38 np0005535656 systemd[1]: libpod-conmon-5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa.scope: Deactivated successfully.
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.217 187223 INFO nova.virt.libvirt.driver [-] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Instance destroyed successfully.#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.218 187223 DEBUG nova.objects.instance [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'resources' on Instance uuid dbbbbc94-b53b-46db-b612-cc535b34fecc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.236 187223 DEBUG nova.virt.libvirt.vif [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-25T19:08:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1456001434',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1456001434',id=15,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:08:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-wkid6hcp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:09:28Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=dbbbbc94-b53b-46db-b612-cc535b34fecc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "address": "fa:16:3e:a0:8a:59", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3200a-d4", "ovs_interfaceid": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.237 187223 DEBUG nova.network.os_vif_util [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "address": "fa:16:3e:a0:8a:59", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7fa3200a-d4", "ovs_interfaceid": "7fa3200a-d4a1-49a5-99cf-2d0b7c75720e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.237 187223 DEBUG nova.network.os_vif_util [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:8a:59,bridge_name='br-int',has_traffic_filtering=True,id=7fa3200a-d4a1-49a5-99cf-2d0b7c75720e,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3200a-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.238 187223 DEBUG os_vif [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:8a:59,bridge_name='br-int',has_traffic_filtering=True,id=7fa3200a-d4a1-49a5-99cf-2d0b7c75720e,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3200a-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.239 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.239 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fa3200a-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.241 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.243 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.245 187223 INFO os_vif [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:8a:59,bridge_name='br-int',has_traffic_filtering=True,id=7fa3200a-d4a1-49a5-99cf-2d0b7c75720e,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7fa3200a-d4')#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.245 187223 INFO nova.virt.libvirt.driver [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Deleting instance files /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc_del#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.246 187223 INFO nova.virt.libvirt.driver [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Deletion of /var/lib/nova/instances/dbbbbc94-b53b-46db-b612-cc535b34fecc_del complete#033[00m
Nov 25 14:09:38 np0005535656 podman[214817]: 2025-11-25 19:09:38.266915159 +0000 UTC m=+0.037817726 container remove 5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.271 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2055c467-8d6d-4bc4-97d2-ca696b19ec2d]: (4, ('Tue Nov 25 07:09:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa)\n5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa\nTue Nov 25 07:09:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa)\n5275decc3caef6e861b1053d88c707ceb3526b05631be37f68d8444804b6f0fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.273 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[9d8f2863-4bd8-4069-a36f-d2222601e658]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.274 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.275 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:38 np0005535656 kernel: tap8e881e87-b0: left promiscuous mode
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.278 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.280 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[471b6b7b-3c9b-422e-8d61-4e525a6f77b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.293 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.302 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[16dd0a5c-6d10-48d5-b49b-34db6f2bba57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.303 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[bb9ef7f1-df5d-4e01-974a-d0461b7aa437]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.315 187223 INFO nova.compute.manager [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.316 187223 DEBUG oslo.service.loopingcall [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.316 187223 DEBUG nova.compute.manager [-] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.316 187223 DEBUG nova.network.neutron [-] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.316 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e3745571-aeaa-4d63-a570-70a2d4f99e40]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468434, 'reachable_time': 16117, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214832, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:38 np0005535656 systemd[1]: run-netns-ovnmeta\x2d8e881e87\x2db103\x2d4ad8\x2d8de5\x2df8f4f0a10891.mount: Deactivated successfully.
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.319 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:09:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:38.319 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[8877351c-d6bd-4ce6-8362-3f3e2e491e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.796 187223 DEBUG nova.compute.manager [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Received event network-vif-plugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.797 187223 DEBUG oslo_concurrency.lockutils [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.798 187223 DEBUG oslo_concurrency.lockutils [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.798 187223 DEBUG oslo_concurrency.lockutils [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "708a51ee-14d7-4511-ab36-5798d1c8de28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.799 187223 DEBUG nova.compute.manager [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] No waiting events found dispatching network-vif-plugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.799 187223 WARNING nova.compute.manager [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Received unexpected event network-vif-plugged-4e91cc02-2569-488e-b88d-1a635ca9e1fa for instance with vm_state deleted and task_state None.#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.799 187223 DEBUG nova.compute.manager [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Received event network-vif-unplugged-7fa3200a-d4a1-49a5-99cf-2d0b7c75720e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.800 187223 DEBUG oslo_concurrency.lockutils [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "dbbbbc94-b53b-46db-b612-cc535b34fecc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.800 187223 DEBUG oslo_concurrency.lockutils [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "dbbbbc94-b53b-46db-b612-cc535b34fecc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.801 187223 DEBUG oslo_concurrency.lockutils [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "dbbbbc94-b53b-46db-b612-cc535b34fecc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.801 187223 DEBUG nova.compute.manager [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] No waiting events found dispatching network-vif-unplugged-7fa3200a-d4a1-49a5-99cf-2d0b7c75720e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.801 187223 DEBUG nova.compute.manager [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Received event network-vif-unplugged-7fa3200a-d4a1-49a5-99cf-2d0b7c75720e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.802 187223 DEBUG nova.compute.manager [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Received event network-vif-plugged-7fa3200a-d4a1-49a5-99cf-2d0b7c75720e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.802 187223 DEBUG oslo_concurrency.lockutils [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "dbbbbc94-b53b-46db-b612-cc535b34fecc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.803 187223 DEBUG oslo_concurrency.lockutils [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "dbbbbc94-b53b-46db-b612-cc535b34fecc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.803 187223 DEBUG oslo_concurrency.lockutils [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "dbbbbc94-b53b-46db-b612-cc535b34fecc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.804 187223 DEBUG nova.compute.manager [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] No waiting events found dispatching network-vif-plugged-7fa3200a-d4a1-49a5-99cf-2d0b7c75720e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.804 187223 WARNING nova.compute.manager [req-bcd9db27-d934-46ea-a6fb-97b8f732fcd7 req-4093a779-f016-40b1-9674-f13fda9504f4 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Received unexpected event network-vif-plugged-7fa3200a-d4a1-49a5-99cf-2d0b7c75720e for instance with vm_state active and task_state deleting.#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.806 187223 DEBUG nova.network.neutron [-] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.824 187223 INFO nova.compute.manager [-] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Took 0.51 seconds to deallocate network for instance.#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.872 187223 DEBUG oslo_concurrency.lockutils [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.872 187223 DEBUG oslo_concurrency.lockutils [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.877 187223 DEBUG oslo_concurrency.lockutils [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.906 187223 INFO nova.scheduler.client.report [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Deleted allocations for instance dbbbbc94-b53b-46db-b612-cc535b34fecc#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.972 187223 DEBUG nova.compute.manager [req-79b9d59e-6c5a-42fa-a2d1-e7f373d2c115 req-b3f43ac5-f6de-402d-a54b-06322a9be676 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Received event network-vif-deleted-7fa3200a-d4a1-49a5-99cf-2d0b7c75720e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:09:38 np0005535656 nova_compute[187219]: 2025-11-25 19:09:38.989 187223 DEBUG oslo_concurrency.lockutils [None req-db07211a-9c8d-4bd5-8d39-eb2fe480dc67 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "dbbbbc94-b53b-46db-b612-cc535b34fecc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:09:39 np0005535656 podman[214833]: 2025-11-25 19:09:39.971805122 +0000 UTC m=+0.081769406 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:09:41 np0005535656 nova_compute[187219]: 2025-11-25 19:09:41.116 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:43 np0005535656 nova_compute[187219]: 2025-11-25 19:09:43.243 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:44.807 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:09:46 np0005535656 nova_compute[187219]: 2025-11-25 19:09:46.118 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:48 np0005535656 nova_compute[187219]: 2025-11-25 19:09:48.247 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:48 np0005535656 podman[214859]: 2025-11-25 19:09:48.940047738 +0000 UTC m=+0.057342359 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 14:09:48 np0005535656 podman[214858]: 2025-11-25 19:09:48.972275902 +0000 UTC m=+0.095020980 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:09:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:09:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:09:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:09:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:09:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:09:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:09:50 np0005535656 nova_compute[187219]: 2025-11-25 19:09:50.210 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764097775.2096536, 708a51ee-14d7-4511-ab36-5798d1c8de28 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:09:50 np0005535656 nova_compute[187219]: 2025-11-25 19:09:50.210 187223 INFO nova.compute.manager [-] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:09:50 np0005535656 nova_compute[187219]: 2025-11-25 19:09:50.314 187223 DEBUG nova.compute.manager [None req-265f82e7-034d-442a-86bf-7e4e28fa683a - - - - - -] [instance: 708a51ee-14d7-4511-ab36-5798d1c8de28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:09:51 np0005535656 nova_compute[187219]: 2025-11-25 19:09:51.119 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:53 np0005535656 nova_compute[187219]: 2025-11-25 19:09:53.217 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764097778.2161946, dbbbbc94-b53b-46db-b612-cc535b34fecc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:09:53 np0005535656 nova_compute[187219]: 2025-11-25 19:09:53.217 187223 INFO nova.compute.manager [-] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:09:53 np0005535656 nova_compute[187219]: 2025-11-25 19:09:53.250 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:53 np0005535656 nova_compute[187219]: 2025-11-25 19:09:53.255 187223 DEBUG nova.compute.manager [None req-7b606f69-545c-4da0-9948-1b6dff52c7e4 - - - - - -] [instance: dbbbbc94-b53b-46db-b612-cc535b34fecc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:09:54 np0005535656 podman[214902]: 2025-11-25 19:09:54.929193949 +0000 UTC m=+0.055564161 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1755695350)
Nov 25 14:09:56 np0005535656 nova_compute[187219]: 2025-11-25 19:09:56.122 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:58 np0005535656 nova_compute[187219]: 2025-11-25 19:09:58.253 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:09:58 np0005535656 nova_compute[187219]: 2025-11-25 19:09:58.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:58 np0005535656 nova_compute[187219]: 2025-11-25 19:09:58.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:09:58 np0005535656 nova_compute[187219]: 2025-11-25 19:09:58.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 25 14:09:58 np0005535656 podman[214925]: 2025-11-25 19:09:58.95633898 +0000 UTC m=+0.073762378 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 25 14:09:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:59.086 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:09:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:59.086 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:09:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:09:59.086 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:01 np0005535656 nova_compute[187219]: 2025-11-25 19:10:01.125 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:01 np0005535656 nova_compute[187219]: 2025-11-25 19:10:01.716 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:01 np0005535656 nova_compute[187219]: 2025-11-25 19:10:01.716 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:10:01 np0005535656 nova_compute[187219]: 2025-11-25 19:10:01.717 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:10:01 np0005535656 nova_compute[187219]: 2025-11-25 19:10:01.742 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:10:02 np0005535656 nova_compute[187219]: 2025-11-25 19:10:02.693 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:03 np0005535656 nova_compute[187219]: 2025-11-25 19:10:03.257 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:03 np0005535656 nova_compute[187219]: 2025-11-25 19:10:03.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:05 np0005535656 podman[197580]: time="2025-11-25T19:10:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:10:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:10:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:10:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:10:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 25 14:10:06 np0005535656 nova_compute[187219]: 2025-11-25 19:10:06.129 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:06 np0005535656 nova_compute[187219]: 2025-11-25 19:10:06.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:06 np0005535656 nova_compute[187219]: 2025-11-25 19:10:06.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:10:07 np0005535656 nova_compute[187219]: 2025-11-25 19:10:07.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:08 np0005535656 nova_compute[187219]: 2025-11-25 19:10:08.260 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:08 np0005535656 nova_compute[187219]: 2025-11-25 19:10:08.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:08 np0005535656 nova_compute[187219]: 2025-11-25 19:10:08.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:08 np0005535656 ovn_controller[95460]: 2025-11-25T19:10:08Z|00127|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Nov 25 14:10:09 np0005535656 nova_compute[187219]: 2025-11-25 19:10:09.686 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:09 np0005535656 nova_compute[187219]: 2025-11-25 19:10:09.710 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:09 np0005535656 nova_compute[187219]: 2025-11-25 19:10:09.710 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:09 np0005535656 nova_compute[187219]: 2025-11-25 19:10:09.711 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:09 np0005535656 nova_compute[187219]: 2025-11-25 19:10:09.711 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:10:09 np0005535656 nova_compute[187219]: 2025-11-25 19:10:09.939 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:10:09 np0005535656 nova_compute[187219]: 2025-11-25 19:10:09.941 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5871MB free_disk=73.16390991210938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:10:09 np0005535656 nova_compute[187219]: 2025-11-25 19:10:09.941 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:09 np0005535656 nova_compute[187219]: 2025-11-25 19:10:09.942 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:10 np0005535656 nova_compute[187219]: 2025-11-25 19:10:10.080 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:10:10 np0005535656 nova_compute[187219]: 2025-11-25 19:10:10.081 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:10:10 np0005535656 nova_compute[187219]: 2025-11-25 19:10:10.162 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:10:10 np0005535656 nova_compute[187219]: 2025-11-25 19:10:10.180 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:10:10 np0005535656 nova_compute[187219]: 2025-11-25 19:10:10.208 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:10:10 np0005535656 nova_compute[187219]: 2025-11-25 19:10:10.208 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:10 np0005535656 nova_compute[187219]: 2025-11-25 19:10:10.209 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:10 np0005535656 nova_compute[187219]: 2025-11-25 19:10:10.210 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 25 14:10:10 np0005535656 nova_compute[187219]: 2025-11-25 19:10:10.226 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 25 14:10:10 np0005535656 podman[214947]: 2025-11-25 19:10:10.979877948 +0000 UTC m=+0.084846417 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:10:11 np0005535656 nova_compute[187219]: 2025-11-25 19:10:11.131 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:12 np0005535656 nova_compute[187219]: 2025-11-25 19:10:12.212 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:10:13 np0005535656 nova_compute[187219]: 2025-11-25 19:10:13.300 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:16 np0005535656 nova_compute[187219]: 2025-11-25 19:10:16.133 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:18 np0005535656 nova_compute[187219]: 2025-11-25 19:10:18.303 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:10:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:10:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:10:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:10:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:10:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:10:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:10:19 np0005535656 podman[214972]: 2025-11-25 19:10:19.957862956 +0000 UTC m=+0.072339991 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 14:10:20 np0005535656 podman[214971]: 2025-11-25 19:10:20.006757537 +0000 UTC m=+0.116093364 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 14:10:21 np0005535656 nova_compute[187219]: 2025-11-25 19:10:21.135 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:23 np0005535656 nova_compute[187219]: 2025-11-25 19:10:23.306 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:25 np0005535656 podman[215014]: 2025-11-25 19:10:25.929325114 +0000 UTC m=+0.056286851 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 25 14:10:26 np0005535656 nova_compute[187219]: 2025-11-25 19:10:26.137 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:28 np0005535656 nova_compute[187219]: 2025-11-25 19:10:28.308 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:29 np0005535656 podman[215036]: 2025-11-25 19:10:29.94104372 +0000 UTC m=+0.064182233 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.139 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.261 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.262 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.295 187223 DEBUG nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.398 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.398 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.407 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.408 187223 INFO nova.compute.claims [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.518 187223 DEBUG nova.compute.provider_tree [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.531 187223 DEBUG nova.scheduler.client.report [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.554 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.555 187223 DEBUG nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.599 187223 DEBUG nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.599 187223 DEBUG nova.network.neutron [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.615 187223 INFO nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.636 187223 DEBUG nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.740 187223 DEBUG nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.741 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.742 187223 INFO nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Creating image(s)#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.742 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "/var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.742 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.743 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.753 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.813 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.814 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.815 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.826 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.879 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.880 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.916 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.916 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.917 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.966 187223 DEBUG nova.policy [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e60aa8a36ef94fa186a5c8de1df9e594', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab3670f92d82410b981d159346c0c038', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.969 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.969 187223 DEBUG nova.virt.disk.api [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Checking if we can resize image /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:10:31 np0005535656 nova_compute[187219]: 2025-11-25 19:10:31.969 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:10:32 np0005535656 nova_compute[187219]: 2025-11-25 19:10:32.019 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:10:32 np0005535656 nova_compute[187219]: 2025-11-25 19:10:32.020 187223 DEBUG nova.virt.disk.api [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Cannot resize image /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:10:32 np0005535656 nova_compute[187219]: 2025-11-25 19:10:32.020 187223 DEBUG nova.objects.instance [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'migration_context' on Instance uuid ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:10:32 np0005535656 nova_compute[187219]: 2025-11-25 19:10:32.040 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:10:32 np0005535656 nova_compute[187219]: 2025-11-25 19:10:32.040 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Ensure instance console log exists: /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:10:32 np0005535656 nova_compute[187219]: 2025-11-25 19:10:32.040 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:32 np0005535656 nova_compute[187219]: 2025-11-25 19:10:32.041 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:32 np0005535656 nova_compute[187219]: 2025-11-25 19:10:32.041 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:32 np0005535656 nova_compute[187219]: 2025-11-25 19:10:32.547 187223 DEBUG nova.network.neutron [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Successfully created port: 320b23d6-cf08-4975-803c-f98b80008661 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:10:33 np0005535656 nova_compute[187219]: 2025-11-25 19:10:33.200 187223 DEBUG nova.network.neutron [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Successfully updated port: 320b23d6-cf08-4975-803c-f98b80008661 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:10:33 np0005535656 nova_compute[187219]: 2025-11-25 19:10:33.277 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:10:33 np0005535656 nova_compute[187219]: 2025-11-25 19:10:33.277 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquired lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:10:33 np0005535656 nova_compute[187219]: 2025-11-25 19:10:33.278 187223 DEBUG nova.network.neutron [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:10:33 np0005535656 nova_compute[187219]: 2025-11-25 19:10:33.311 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:33 np0005535656 nova_compute[187219]: 2025-11-25 19:10:33.382 187223 DEBUG nova.compute.manager [req-ea39cc53-1720-4572-8443-03c2d9320d9e req-ab9dccdf-6581-49eb-b7cd-57598f47bac6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-changed-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:10:33 np0005535656 nova_compute[187219]: 2025-11-25 19:10:33.383 187223 DEBUG nova.compute.manager [req-ea39cc53-1720-4572-8443-03c2d9320d9e req-ab9dccdf-6581-49eb-b7cd-57598f47bac6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Refreshing instance network info cache due to event network-changed-320b23d6-cf08-4975-803c-f98b80008661. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:10:33 np0005535656 nova_compute[187219]: 2025-11-25 19:10:33.383 187223 DEBUG oslo_concurrency.lockutils [req-ea39cc53-1720-4572-8443-03c2d9320d9e req-ab9dccdf-6581-49eb-b7cd-57598f47bac6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:10:33 np0005535656 nova_compute[187219]: 2025-11-25 19:10:33.514 187223 DEBUG nova.network.neutron [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:10:35 np0005535656 podman[197580]: time="2025-11-25T19:10:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:10:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:10:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:10:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:10:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.004 187223 DEBUG nova.network.neutron [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Updating instance_info_cache with network_info: [{"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.040 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Releasing lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.041 187223 DEBUG nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Instance network_info: |[{"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.041 187223 DEBUG oslo_concurrency.lockutils [req-ea39cc53-1720-4572-8443-03c2d9320d9e req-ab9dccdf-6581-49eb-b7cd-57598f47bac6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.041 187223 DEBUG nova.network.neutron [req-ea39cc53-1720-4572-8443-03c2d9320d9e req-ab9dccdf-6581-49eb-b7cd-57598f47bac6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Refreshing network info cache for port 320b23d6-cf08-4975-803c-f98b80008661 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.045 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Start _get_guest_xml network_info=[{"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.049 187223 WARNING nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.054 187223 DEBUG nova.virt.libvirt.host [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.054 187223 DEBUG nova.virt.libvirt.host [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.057 187223 DEBUG nova.virt.libvirt.host [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.058 187223 DEBUG nova.virt.libvirt.host [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.059 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.059 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.060 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.060 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.060 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.060 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.061 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.061 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.061 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.062 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.062 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.062 187223 DEBUG nova.virt.hardware [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.066 187223 DEBUG nova.virt.libvirt.vif [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:10:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1636264059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1636264059',id=18,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-3qu9r21r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:10:31Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.066 187223 DEBUG nova.network.os_vif_util [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.067 187223 DEBUG nova.network.os_vif_util [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:ef:29,bridge_name='br-int',has_traffic_filtering=True,id=320b23d6-cf08-4975-803c-f98b80008661,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap320b23d6-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.068 187223 DEBUG nova.objects.instance [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'pci_devices' on Instance uuid ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.089 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <uuid>ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1</uuid>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <name>instance-00000012</name>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteStrategies-server-1636264059</nova:name>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:10:36</nova:creationTime>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:        <nova:user uuid="e60aa8a36ef94fa186a5c8de1df9e594">tempest-TestExecuteStrategies-2025590332-project-member</nova:user>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:        <nova:project uuid="ab3670f92d82410b981d159346c0c038">tempest-TestExecuteStrategies-2025590332</nova:project>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:        <nova:port uuid="320b23d6-cf08-4975-803c-f98b80008661">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <entry name="serial">ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1</entry>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <entry name="uuid">ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1</entry>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk.config"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:d8:ef:29"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <target dev="tap320b23d6-cf"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/console.log" append="off"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:10:36 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:10:36 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:10:36 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:10:36 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.091 187223 DEBUG nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Preparing to wait for external event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.091 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.091 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.091 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.092 187223 DEBUG nova.virt.libvirt.vif [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:10:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1636264059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1636264059',id=18,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-3qu9r21r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:10:31Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.092 187223 DEBUG nova.network.os_vif_util [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.093 187223 DEBUG nova.network.os_vif_util [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:ef:29,bridge_name='br-int',has_traffic_filtering=True,id=320b23d6-cf08-4975-803c-f98b80008661,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap320b23d6-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.093 187223 DEBUG os_vif [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:ef:29,bridge_name='br-int',has_traffic_filtering=True,id=320b23d6-cf08-4975-803c-f98b80008661,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap320b23d6-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.094 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.094 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.094 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.096 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.096 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap320b23d6-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.097 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap320b23d6-cf, col_values=(('external_ids', {'iface-id': '320b23d6-cf08-4975-803c-f98b80008661', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:ef:29', 'vm-uuid': 'ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.098 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:36 np0005535656 NetworkManager[55548]: <info>  [1764097836.0997] manager: (tap320b23d6-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.100 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.104 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.105 187223 INFO os_vif [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:ef:29,bridge_name='br-int',has_traffic_filtering=True,id=320b23d6-cf08-4975-803c-f98b80008661,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap320b23d6-cf')#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.140 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.150 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.150 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.150 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No VIF found with MAC fa:16:3e:d8:ef:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.151 187223 INFO nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Using config drive#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.682 187223 INFO nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Creating config drive at /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk.config#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.688 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkza5iiyh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.830 187223 DEBUG oslo_concurrency.processutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkza5iiyh" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:10:36 np0005535656 kernel: tap320b23d6-cf: entered promiscuous mode
Nov 25 14:10:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:10:36Z|00128|binding|INFO|Claiming lport 320b23d6-cf08-4975-803c-f98b80008661 for this chassis.
Nov 25 14:10:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:10:36Z|00129|binding|INFO|320b23d6-cf08-4975-803c-f98b80008661: Claiming fa:16:3e:d8:ef:29 10.100.0.12
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.901 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:36 np0005535656 NetworkManager[55548]: <info>  [1764097836.9036] manager: (tap320b23d6-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.909 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:ef:29 10.100.0.12'], port_security=['fa:16:3e:d8:ef:29 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=320b23d6-cf08-4975-803c-f98b80008661) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.910 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 320b23d6-cf08-4975-803c-f98b80008661 in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 bound to our chassis#033[00m
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.911 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.915 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:10:36Z|00130|binding|INFO|Setting lport 320b23d6-cf08-4975-803c-f98b80008661 ovn-installed in OVS
Nov 25 14:10:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:10:36Z|00131|binding|INFO|Setting lport 320b23d6-cf08-4975-803c-f98b80008661 up in Southbound
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.918 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:36 np0005535656 nova_compute[187219]: 2025-11-25 19:10:36.922 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.930 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1851c18c-db3b-42de-bee3-e1cafba3fa3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.931 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e881e87-b1 in ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.933 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e881e87-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.933 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e05bb3e0-e2d6-4abc-a45d-9ea8ac4a4d45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.934 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[ef454264-959e-40ff-9c7d-450475418bc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:36 np0005535656 systemd-udevd[215090]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:10:36 np0005535656 systemd-machined[153481]: New machine qemu-12-instance-00000012.
Nov 25 14:10:36 np0005535656 NetworkManager[55548]: <info>  [1764097836.9504] device (tap320b23d6-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:10:36 np0005535656 NetworkManager[55548]: <info>  [1764097836.9517] device (tap320b23d6-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.953 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[29231908-71a4-4dee-b8c6-d176c74eb164]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:36 np0005535656 systemd[1]: Started Virtual Machine qemu-12-instance-00000012.
Nov 25 14:10:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:36.975 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4c68e2fe-e3e5-4597-9e81-963a8f0aaa3e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.005 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[ff4c685b-364e-478b-bbcf-26d6edb3608f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.011 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d77bf0-8973-470d-af24-8bc0ef43442f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 NetworkManager[55548]: <info>  [1764097837.0123] manager: (tap8e881e87-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Nov 25 14:10:37 np0005535656 systemd-udevd[215094]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.039 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a2ade4-f73f-475c-9fd0-5ac07ba7acb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.043 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[4c50fc4f-07f2-4791-80b2-12f075980588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 NetworkManager[55548]: <info>  [1764097837.0604] device (tap8e881e87-b0): carrier: link connected
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.064 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[c346557b-2067-447e-be16-e10685c33493]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.077 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f2750625-06b8-4315-bef8-1deebb016f1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479858, 'reachable_time': 30503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215123, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.088 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[77baa6a7-47a3-4662-9514-692743838f7f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:6d5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 479858, 'tstamp': 479858}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215124, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.101 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[22950fd8-4b3b-4fe9-b76f-81c4527df58d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479858, 'reachable_time': 30503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215125, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.135 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4be067e4-24ed-4ceb-9f80-a0e7b99f9950]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 nova_compute[187219]: 2025-11-25 19:10:37.199 187223 DEBUG nova.compute.manager [req-f3c5e178-1178-426c-b5b1-752c0be8517e req-c1d025a2-ed3b-4f1f-83b9-8be285f8aeb7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:10:37 np0005535656 nova_compute[187219]: 2025-11-25 19:10:37.200 187223 DEBUG oslo_concurrency.lockutils [req-f3c5e178-1178-426c-b5b1-752c0be8517e req-c1d025a2-ed3b-4f1f-83b9-8be285f8aeb7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:37 np0005535656 nova_compute[187219]: 2025-11-25 19:10:37.201 187223 DEBUG oslo_concurrency.lockutils [req-f3c5e178-1178-426c-b5b1-752c0be8517e req-c1d025a2-ed3b-4f1f-83b9-8be285f8aeb7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:37 np0005535656 nova_compute[187219]: 2025-11-25 19:10:37.201 187223 DEBUG oslo_concurrency.lockutils [req-f3c5e178-1178-426c-b5b1-752c0be8517e req-c1d025a2-ed3b-4f1f-83b9-8be285f8aeb7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:37 np0005535656 nova_compute[187219]: 2025-11-25 19:10:37.202 187223 DEBUG nova.compute.manager [req-f3c5e178-1178-426c-b5b1-752c0be8517e req-c1d025a2-ed3b-4f1f-83b9-8be285f8aeb7 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Processing event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.201 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[37a0095c-1aa3-4e29-a58a-5e91b0106b5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.203 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.203 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.204 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e881e87-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:10:37 np0005535656 NetworkManager[55548]: <info>  [1764097837.2060] manager: (tap8e881e87-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Nov 25 14:10:37 np0005535656 kernel: tap8e881e87-b0: entered promiscuous mode
Nov 25 14:10:37 np0005535656 nova_compute[187219]: 2025-11-25 19:10:37.207 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.208 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e881e87-b0, col_values=(('external_ids', {'iface-id': 'f01fca37-0f9e-4574-bd34-7de06647d521'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:10:37 np0005535656 ovn_controller[95460]: 2025-11-25T19:10:37Z|00132|binding|INFO|Releasing lport f01fca37-0f9e-4574-bd34-7de06647d521 from this chassis (sb_readonly=0)
Nov 25 14:10:37 np0005535656 nova_compute[187219]: 2025-11-25 19:10:37.220 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.221 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.222 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[490cb15e-a495-4e7e-9d8a-9e23664325d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.222 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 14:10:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:37.223 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'env', 'PROCESS_TAG=haproxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e881e87-b103-4ad8-8de5-f8f4f0a10891.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:10:37 np0005535656 podman[215157]: 2025-11-25 19:10:37.555076334 +0000 UTC m=+0.049026876 container create 77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:10:37 np0005535656 systemd[1]: Started libpod-conmon-77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386.scope.
Nov 25 14:10:37 np0005535656 podman[215157]: 2025-11-25 19:10:37.52658485 +0000 UTC m=+0.020535382 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:10:37 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:10:37 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b89b083622be9ca9f9766926569c8ab34e134285db0f6035e0ec50e7e556b33/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:10:37 np0005535656 podman[215157]: 2025-11-25 19:10:37.643149636 +0000 UTC m=+0.137100158 container init 77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:10:37 np0005535656 podman[215157]: 2025-11-25 19:10:37.647934414 +0000 UTC m=+0.141884916 container start 77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 25 14:10:37 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215172]: [NOTICE]   (215176) : New worker (215178) forked
Nov 25 14:10:37 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215172]: [NOTICE]   (215176) : Loading success.
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.011 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:38.013 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:10:38 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:38.014 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.211 187223 DEBUG nova.network.neutron [req-ea39cc53-1720-4572-8443-03c2d9320d9e req-ab9dccdf-6581-49eb-b7cd-57598f47bac6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Updated VIF entry in instance network info cache for port 320b23d6-cf08-4975-803c-f98b80008661. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.212 187223 DEBUG nova.network.neutron [req-ea39cc53-1720-4572-8443-03c2d9320d9e req-ab9dccdf-6581-49eb-b7cd-57598f47bac6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Updating instance_info_cache with network_info: [{"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.233 187223 DEBUG oslo_concurrency.lockutils [req-ea39cc53-1720-4572-8443-03c2d9320d9e req-ab9dccdf-6581-49eb-b7cd-57598f47bac6 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.572 187223 DEBUG nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.573 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097838.5718124, ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.573 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] VM Started (Lifecycle Event)#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.577 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.580 187223 INFO nova.virt.libvirt.driver [-] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Instance spawned successfully.#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.580 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.611 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.614 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.614 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.614 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.615 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.615 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.615 187223 DEBUG nova.virt.libvirt.driver [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.619 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.676 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.677 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097838.5728242, ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.677 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.697 187223 INFO nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Took 6.96 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.697 187223 DEBUG nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.699 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.706 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097838.575234, ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.707 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.745 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.748 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.770 187223 INFO nova.compute.manager [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Took 7.40 seconds to build instance.#033[00m
Nov 25 14:10:38 np0005535656 nova_compute[187219]: 2025-11-25 19:10:38.802 187223 DEBUG oslo_concurrency.lockutils [None req-f6b07565-92f7-4ee6-a020-390923eaef6b e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:39 np0005535656 nova_compute[187219]: 2025-11-25 19:10:39.314 187223 DEBUG nova.compute.manager [req-4bfe0a26-402d-4242-9168-573324c7d064 req-d17bfdad-b633-4b44-9e04-b2044b293ccb 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:10:39 np0005535656 nova_compute[187219]: 2025-11-25 19:10:39.315 187223 DEBUG oslo_concurrency.lockutils [req-4bfe0a26-402d-4242-9168-573324c7d064 req-d17bfdad-b633-4b44-9e04-b2044b293ccb 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:39 np0005535656 nova_compute[187219]: 2025-11-25 19:10:39.316 187223 DEBUG oslo_concurrency.lockutils [req-4bfe0a26-402d-4242-9168-573324c7d064 req-d17bfdad-b633-4b44-9e04-b2044b293ccb 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:39 np0005535656 nova_compute[187219]: 2025-11-25 19:10:39.316 187223 DEBUG oslo_concurrency.lockutils [req-4bfe0a26-402d-4242-9168-573324c7d064 req-d17bfdad-b633-4b44-9e04-b2044b293ccb 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:10:39 np0005535656 nova_compute[187219]: 2025-11-25 19:10:39.317 187223 DEBUG nova.compute.manager [req-4bfe0a26-402d-4242-9168-573324c7d064 req-d17bfdad-b633-4b44-9e04-b2044b293ccb 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:10:39 np0005535656 nova_compute[187219]: 2025-11-25 19:10:39.317 187223 WARNING nova.compute.manager [req-4bfe0a26-402d-4242-9168-573324c7d064 req-d17bfdad-b633-4b44-9e04-b2044b293ccb 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received unexpected event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with vm_state active and task_state None.#033[00m
Nov 25 14:10:41 np0005535656 nova_compute[187219]: 2025-11-25 19:10:41.100 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:41 np0005535656 nova_compute[187219]: 2025-11-25 19:10:41.143 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:41 np0005535656 podman[215194]: 2025-11-25 19:10:41.968903233 +0000 UTC m=+0.076878052 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 14:10:46 np0005535656 nova_compute[187219]: 2025-11-25 19:10:46.104 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:46 np0005535656 nova_compute[187219]: 2025-11-25 19:10:46.145 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:48 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:48.017 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:10:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:10:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:10:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:10:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:10:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:10:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:10:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:10:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:10:50 np0005535656 podman[215221]: 2025-11-25 19:10:50.981945082 +0000 UTC m=+0.093623442 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:10:51 np0005535656 podman[215220]: 2025-11-25 19:10:51.022099968 +0000 UTC m=+0.138675620 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:10:51 np0005535656 nova_compute[187219]: 2025-11-25 19:10:51.107 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:51 np0005535656 nova_compute[187219]: 2025-11-25 19:10:51.148 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:53 np0005535656 ovn_controller[95460]: 2025-11-25T19:10:53Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:ef:29 10.100.0.12
Nov 25 14:10:53 np0005535656 ovn_controller[95460]: 2025-11-25T19:10:53Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:ef:29 10.100.0.12
Nov 25 14:10:56 np0005535656 nova_compute[187219]: 2025-11-25 19:10:56.111 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:56 np0005535656 nova_compute[187219]: 2025-11-25 19:10:56.149 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:10:56 np0005535656 podman[215278]: 2025-11-25 19:10:56.949891565 +0000 UTC m=+0.066475223 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=)
Nov 25 14:10:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:59.087 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:10:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:59.088 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:10:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:10:59.089 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:00 np0005535656 nova_compute[187219]: 2025-11-25 19:11:00.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:00 np0005535656 podman[215301]: 2025-11-25 19:11:00.984242089 +0000 UTC m=+0.098051640 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd)
Nov 25 14:11:01 np0005535656 nova_compute[187219]: 2025-11-25 19:11:01.113 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:01 np0005535656 nova_compute[187219]: 2025-11-25 19:11:01.152 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:02 np0005535656 nova_compute[187219]: 2025-11-25 19:11:02.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:02 np0005535656 nova_compute[187219]: 2025-11-25 19:11:02.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:11:02 np0005535656 nova_compute[187219]: 2025-11-25 19:11:02.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:11:02 np0005535656 nova_compute[187219]: 2025-11-25 19:11:02.984 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:11:02 np0005535656 nova_compute[187219]: 2025-11-25 19:11:02.984 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:11:02 np0005535656 nova_compute[187219]: 2025-11-25 19:11:02.985 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 14:11:02 np0005535656 nova_compute[187219]: 2025-11-25 19:11:02.985 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:11:05 np0005535656 podman[197580]: time="2025-11-25T19:11:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:11:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:11:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:11:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:11:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3057 "" "Go-http-client/1.1"
Nov 25 14:11:05 np0005535656 nova_compute[187219]: 2025-11-25 19:11:05.994 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Updating instance_info_cache with network_info: [{"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:11:06 np0005535656 nova_compute[187219]: 2025-11-25 19:11:06.009 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:11:06 np0005535656 nova_compute[187219]: 2025-11-25 19:11:06.010 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 14:11:06 np0005535656 nova_compute[187219]: 2025-11-25 19:11:06.010 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:06 np0005535656 nova_compute[187219]: 2025-11-25 19:11:06.119 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:06 np0005535656 nova_compute[187219]: 2025-11-25 19:11:06.156 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:06 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:06Z|00133|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Nov 25 14:11:08 np0005535656 nova_compute[187219]: 2025-11-25 19:11:08.006 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:08 np0005535656 nova_compute[187219]: 2025-11-25 19:11:08.006 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:08 np0005535656 nova_compute[187219]: 2025-11-25 19:11:08.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:08 np0005535656 nova_compute[187219]: 2025-11-25 19:11:08.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:11:09 np0005535656 nova_compute[187219]: 2025-11-25 19:11:09.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.699 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.701 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.701 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.701 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.775 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.831 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.832 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:11:10 np0005535656 nova_compute[187219]: 2025-11-25 19:11:10.884 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.039 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.041 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.13497924804688GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.041 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.041 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.121 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.122 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.122 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.124 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.160 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.277 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.302 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.331 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:11:11 np0005535656 nova_compute[187219]: 2025-11-25 19:11:11.331 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:12 np0005535656 nova_compute[187219]: 2025-11-25 19:11:12.332 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:11:12 np0005535656 podman[215330]: 2025-11-25 19:11:12.929553588 +0000 UTC m=+0.049603051 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:11:16 np0005535656 nova_compute[187219]: 2025-11-25 19:11:16.128 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:16 np0005535656 nova_compute[187219]: 2025-11-25 19:11:16.161 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:11:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:11:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:11:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:11:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:11:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:11:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:11:21 np0005535656 nova_compute[187219]: 2025-11-25 19:11:21.131 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:21 np0005535656 nova_compute[187219]: 2025-11-25 19:11:21.163 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:21 np0005535656 podman[215356]: 2025-11-25 19:11:21.98901687 +0000 UTC m=+0.088382630 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 25 14:11:22 np0005535656 podman[215355]: 2025-11-25 19:11:22.020390712 +0000 UTC m=+0.124923070 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Nov 25 14:11:23 np0005535656 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 25 14:11:26 np0005535656 nova_compute[187219]: 2025-11-25 19:11:26.135 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:26 np0005535656 nova_compute[187219]: 2025-11-25 19:11:26.166 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:26 np0005535656 nova_compute[187219]: 2025-11-25 19:11:26.350 187223 DEBUG nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Check if temp file /var/lib/nova/instances/tmp35r7uc02 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 25 14:11:26 np0005535656 nova_compute[187219]: 2025-11-25 19:11:26.351 187223 DEBUG nova.compute.manager [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp35r7uc02',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 25 14:11:27 np0005535656 nova_compute[187219]: 2025-11-25 19:11:27.584 187223 DEBUG oslo_concurrency.processutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:11:27 np0005535656 nova_compute[187219]: 2025-11-25 19:11:27.672 187223 DEBUG oslo_concurrency.processutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:11:27 np0005535656 nova_compute[187219]: 2025-11-25 19:11:27.674 187223 DEBUG oslo_concurrency.processutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:11:27 np0005535656 nova_compute[187219]: 2025-11-25 19:11:27.771 187223 DEBUG oslo_concurrency.processutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:11:27 np0005535656 podman[215404]: 2025-11-25 19:11:27.998256751 +0000 UTC m=+0.103636350 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Nov 25 14:11:30 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 14:11:30 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 14:11:30 np0005535656 systemd-logind[788]: New session 36 of user nova.
Nov 25 14:11:30 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 14:11:30 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 14:11:30 np0005535656 systemd[215431]: Queued start job for default target Main User Target.
Nov 25 14:11:30 np0005535656 systemd[215431]: Created slice User Application Slice.
Nov 25 14:11:30 np0005535656 systemd[215431]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:11:30 np0005535656 systemd[215431]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 14:11:30 np0005535656 systemd[215431]: Reached target Paths.
Nov 25 14:11:30 np0005535656 systemd[215431]: Reached target Timers.
Nov 25 14:11:30 np0005535656 systemd[215431]: Starting D-Bus User Message Bus Socket...
Nov 25 14:11:30 np0005535656 systemd[215431]: Starting Create User's Volatile Files and Directories...
Nov 25 14:11:30 np0005535656 systemd[215431]: Listening on D-Bus User Message Bus Socket.
Nov 25 14:11:30 np0005535656 systemd[215431]: Reached target Sockets.
Nov 25 14:11:30 np0005535656 systemd[215431]: Finished Create User's Volatile Files and Directories.
Nov 25 14:11:30 np0005535656 systemd[215431]: Reached target Basic System.
Nov 25 14:11:30 np0005535656 systemd[215431]: Reached target Main User Target.
Nov 25 14:11:30 np0005535656 systemd[215431]: Startup finished in 161ms.
Nov 25 14:11:30 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 14:11:30 np0005535656 systemd[1]: Started Session 36 of User nova.
Nov 25 14:11:30 np0005535656 systemd[1]: session-36.scope: Deactivated successfully.
Nov 25 14:11:30 np0005535656 systemd-logind[788]: Session 36 logged out. Waiting for processes to exit.
Nov 25 14:11:30 np0005535656 systemd-logind[788]: Removed session 36.
Nov 25 14:11:31 np0005535656 nova_compute[187219]: 2025-11-25 19:11:31.138 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:31 np0005535656 nova_compute[187219]: 2025-11-25 19:11:31.168 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:31 np0005535656 podman[215449]: 2025-11-25 19:11:31.981155065 +0000 UTC m=+0.090559130 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 14:11:32 np0005535656 nova_compute[187219]: 2025-11-25 19:11:32.327 187223 DEBUG nova.compute.manager [req-f4880299-be9f-486b-907c-2c7c543bf9f9 req-e6852195-50db-4334-a5e1-f794d2901f2b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:32 np0005535656 nova_compute[187219]: 2025-11-25 19:11:32.328 187223 DEBUG oslo_concurrency.lockutils [req-f4880299-be9f-486b-907c-2c7c543bf9f9 req-e6852195-50db-4334-a5e1-f794d2901f2b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:32 np0005535656 nova_compute[187219]: 2025-11-25 19:11:32.328 187223 DEBUG oslo_concurrency.lockutils [req-f4880299-be9f-486b-907c-2c7c543bf9f9 req-e6852195-50db-4334-a5e1-f794d2901f2b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:32 np0005535656 nova_compute[187219]: 2025-11-25 19:11:32.328 187223 DEBUG oslo_concurrency.lockutils [req-f4880299-be9f-486b-907c-2c7c543bf9f9 req-e6852195-50db-4334-a5e1-f794d2901f2b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:32 np0005535656 nova_compute[187219]: 2025-11-25 19:11:32.328 187223 DEBUG nova.compute.manager [req-f4880299-be9f-486b-907c-2c7c543bf9f9 req-e6852195-50db-4334-a5e1-f794d2901f2b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:32 np0005535656 nova_compute[187219]: 2025-11-25 19:11:32.328 187223 DEBUG nova.compute.manager [req-f4880299-be9f-486b-907c-2c7c543bf9f9 req-e6852195-50db-4334-a5e1-f794d2901f2b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.101 187223 INFO nova.compute.manager [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Took 6.33 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.101 187223 DEBUG nova.compute.manager [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.125 187223 DEBUG nova.compute.manager [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp35r7uc02',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(73d0cd1c-e84a-417c-bd8a-5391cef85cf0),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.156 187223 DEBUG nova.objects.instance [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.158 187223 DEBUG nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.161 187223 DEBUG nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.161 187223 DEBUG nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.182 187223 DEBUG nova.virt.libvirt.vif [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:10:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1636264059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1636264059',id=18,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:10:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-3qu9r21r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:10:38Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.182 187223 DEBUG nova.network.os_vif_util [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.183 187223 DEBUG nova.network.os_vif_util [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:ef:29,bridge_name='br-int',has_traffic_filtering=True,id=320b23d6-cf08-4975-803c-f98b80008661,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap320b23d6-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.184 187223 DEBUG nova.virt.libvirt.migration [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 14:11:34 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:d8:ef:29"/>
Nov 25 14:11:34 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 14:11:34 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:11:34 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 14:11:34 np0005535656 nova_compute[187219]:  <target dev="tap320b23d6-cf"/>
Nov 25 14:11:34 np0005535656 nova_compute[187219]: </interface>
Nov 25 14:11:34 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.185 187223 DEBUG nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.457 187223 DEBUG nova.compute.manager [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.458 187223 DEBUG oslo_concurrency.lockutils [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.459 187223 DEBUG oslo_concurrency.lockutils [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.460 187223 DEBUG oslo_concurrency.lockutils [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.461 187223 DEBUG nova.compute.manager [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.462 187223 WARNING nova.compute.manager [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received unexpected event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.463 187223 DEBUG nova.compute.manager [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-changed-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.464 187223 DEBUG nova.compute.manager [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Refreshing instance network info cache due to event network-changed-320b23d6-cf08-4975-803c-f98b80008661. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.464 187223 DEBUG oslo_concurrency.lockutils [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.465 187223 DEBUG oslo_concurrency.lockutils [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.466 187223 DEBUG nova.network.neutron [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Refreshing network info cache for port 320b23d6-cf08-4975-803c-f98b80008661 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.664 187223 DEBUG nova.virt.libvirt.migration [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.665 187223 INFO nova.virt.libvirt.migration [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 14:11:34 np0005535656 nova_compute[187219]: 2025-11-25 19:11:34.760 187223 INFO nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.265 187223 DEBUG nova.virt.libvirt.migration [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.266 187223 DEBUG nova.virt.libvirt.migration [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:11:35 np0005535656 podman[197580]: time="2025-11-25T19:11:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:11:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:11:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:11:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:11:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.770 187223 DEBUG nova.virt.libvirt.migration [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.770 187223 DEBUG nova.virt.libvirt.migration [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.797 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097895.797331, ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.798 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.816 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.820 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.857 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 14:11:35 np0005535656 kernel: tap320b23d6-cf (unregistering): left promiscuous mode
Nov 25 14:11:35 np0005535656 NetworkManager[55548]: <info>  [1764097895.9451] device (tap320b23d6-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:11:35 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:35Z|00134|binding|INFO|Releasing lport 320b23d6-cf08-4975-803c-f98b80008661 from this chassis (sb_readonly=0)
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.951 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:35 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:35Z|00135|binding|INFO|Setting lport 320b23d6-cf08-4975-803c-f98b80008661 down in Southbound
Nov 25 14:11:35 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:35Z|00136|binding|INFO|Removing iface tap320b23d6-cf ovn-installed in OVS
Nov 25 14:11:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:35.960 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:ef:29 10.100.0.12'], port_security=['fa:16:3e:d8:ef:29 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=320b23d6-cf08-4975-803c-f98b80008661) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:11:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:35.963 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 320b23d6-cf08-4975-803c-f98b80008661 in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:11:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:35.966 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:11:35 np0005535656 nova_compute[187219]: 2025-11-25 19:11:35.968 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:35.969 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f77305c0-2e4e-41d0-b5d0-27f4e7a017a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:35.970 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace which is not needed anymore#033[00m
Nov 25 14:11:36 np0005535656 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Deactivated successfully.
Nov 25 14:11:36 np0005535656 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Consumed 17.335s CPU time.
Nov 25 14:11:36 np0005535656 systemd-machined[153481]: Machine qemu-12-instance-00000012 terminated.
Nov 25 14:11:36 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215172]: [NOTICE]   (215176) : haproxy version is 2.8.14-c23fe91
Nov 25 14:11:36 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215172]: [NOTICE]   (215176) : path to executable is /usr/sbin/haproxy
Nov 25 14:11:36 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215172]: [WARNING]  (215176) : Exiting Master process...
Nov 25 14:11:36 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215172]: [ALERT]    (215176) : Current worker (215178) exited with code 143 (Terminated)
Nov 25 14:11:36 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215172]: [WARNING]  (215176) : All workers exited. Exiting... (0)
Nov 25 14:11:36 np0005535656 systemd[1]: libpod-77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386.scope: Deactivated successfully.
Nov 25 14:11:36 np0005535656 podman[215510]: 2025-11-25 19:11:36.139341258 +0000 UTC m=+0.063671639 container died 77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.142 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:36 np0005535656 kernel: tap320b23d6-cf: entered promiscuous mode
Nov 25 14:11:36 np0005535656 NetworkManager[55548]: <info>  [1764097896.1535] manager: (tap320b23d6-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Nov 25 14:11:36 np0005535656 systemd-udevd[215490]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:11:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:36Z|00137|binding|INFO|Claiming lport 320b23d6-cf08-4975-803c-f98b80008661 for this chassis.
Nov 25 14:11:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:36Z|00138|binding|INFO|320b23d6-cf08-4975-803c-f98b80008661: Claiming fa:16:3e:d8:ef:29 10.100.0.12
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.154 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:36 np0005535656 kernel: tap320b23d6-cf (unregistering): left promiscuous mode
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.165 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:ef:29 10.100.0.12'], port_security=['fa:16:3e:d8:ef:29 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=320b23d6-cf08-4975-803c-f98b80008661) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:11:36 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386-userdata-shm.mount: Deactivated successfully.
Nov 25 14:11:36 np0005535656 systemd[1]: var-lib-containers-storage-overlay-6b89b083622be9ca9f9766926569c8ab34e134285db0f6035e0ec50e7e556b33-merged.mount: Deactivated successfully.
Nov 25 14:11:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:36Z|00139|binding|INFO|Setting lport 320b23d6-cf08-4975-803c-f98b80008661 ovn-installed in OVS
Nov 25 14:11:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:36Z|00140|binding|INFO|Setting lport 320b23d6-cf08-4975-803c-f98b80008661 up in Southbound
Nov 25 14:11:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:36Z|00141|binding|INFO|Releasing lport 320b23d6-cf08-4975-803c-f98b80008661 from this chassis (sb_readonly=1)
Nov 25 14:11:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:36Z|00142|if_status|INFO|Not setting lport 320b23d6-cf08-4975-803c-f98b80008661 down as sb is readonly
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.184 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.186 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:36Z|00143|binding|INFO|Removing iface tap320b23d6-cf ovn-installed in OVS
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.187 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:36 np0005535656 podman[215510]: 2025-11-25 19:11:36.189778191 +0000 UTC m=+0.114108582 container cleanup 77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 25 14:11:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:36Z|00144|binding|INFO|Releasing lport 320b23d6-cf08-4975-803c-f98b80008661 from this chassis (sb_readonly=0)
Nov 25 14:11:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:11:36Z|00145|binding|INFO|Setting lport 320b23d6-cf08-4975-803c-f98b80008661 down in Southbound
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.199 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:ef:29 10.100.0.12'], port_security=['fa:16:3e:d8:ef:29 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=320b23d6-cf08-4975-803c-f98b80008661) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.199 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:36 np0005535656 systemd[1]: libpod-conmon-77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386.scope: Deactivated successfully.
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.221 187223 DEBUG nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.221 187223 DEBUG nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.222 187223 DEBUG nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.275 187223 DEBUG nova.virt.libvirt.guest [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1' (instance-00000012) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.276 187223 INFO nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Migration operation has completed#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.276 187223 INFO nova.compute.manager [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] _post_live_migration() is started..#033[00m
Nov 25 14:11:36 np0005535656 podman[215547]: 2025-11-25 19:11:36.28000945 +0000 UTC m=+0.056001063 container remove 77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.290 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[79a19cbb-0971-46ad-8f24-4a9848ce2b47]: (4, ('Tue Nov 25 07:11:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386)\n77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386\nTue Nov 25 07:11:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386)\n77cf773b8d116b2baad445ce494e052d863b060b3e806d3db840a34827ec6386\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.292 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d4838d-80f6-4ea2-b698-4522da73a859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.293 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.295 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:36 np0005535656 kernel: tap8e881e87-b0: left promiscuous mode
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.312 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.316 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b08f3fbf-9b76-4198-aef1-8dc3b727d381]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.336 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4c03a34a-efa7-4658-8e94-74f971c992f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.337 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c3130c96-4cd9-4b0e-af9b-734dd843f983]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.360 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5f18a9-27c2-445e-878f-1e6ac7efc839]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479852, 'reachable_time': 43850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215565, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:36 np0005535656 systemd[1]: run-netns-ovnmeta\x2d8e881e87\x2db103\x2d4ad8\x2d8de5\x2df8f4f0a10891.mount: Deactivated successfully.
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.365 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.365 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b889ff-55d0-4416-a687-3063af167564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.366 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 320b23d6-cf08-4975-803c-f98b80008661 in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.367 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.367 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[027e339b-1e20-488b-abda-2e77d8a595be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.368 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 320b23d6-cf08-4975-803c-f98b80008661 in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.369 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:11:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:36.369 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[108a0d46-88b8-479c-9229-b1e89491c0cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.467 187223 DEBUG nova.network.neutron [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Updated VIF entry in instance network info cache for port 320b23d6-cf08-4975-803c-f98b80008661. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.468 187223 DEBUG nova.network.neutron [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Updating instance_info_cache with network_info: [{"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.484 187223 DEBUG oslo_concurrency.lockutils [req-cb195485-bc0d-42ad-b704-9c68d738709d req-4027c30a-b6aa-4561-bdff-bb9cc7ffd6d1 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.614 187223 DEBUG nova.compute.manager [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.614 187223 DEBUG oslo_concurrency.lockutils [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.614 187223 DEBUG oslo_concurrency.lockutils [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.614 187223 DEBUG oslo_concurrency.lockutils [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.615 187223 DEBUG nova.compute.manager [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.615 187223 DEBUG nova.compute.manager [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.615 187223 DEBUG nova.compute.manager [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.615 187223 DEBUG oslo_concurrency.lockutils [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.615 187223 DEBUG oslo_concurrency.lockutils [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.615 187223 DEBUG oslo_concurrency.lockutils [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.615 187223 DEBUG nova.compute.manager [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:36 np0005535656 nova_compute[187219]: 2025-11-25 19:11:36.616 187223 WARNING nova.compute.manager [req-70b35b9d-183a-4dea-907e-a6779f8ab1b6 req-6e5d75ea-c867-4b6a-996b-6aa917207d1d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received unexpected event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.211 187223 DEBUG nova.compute.manager [req-eb414ed7-9bc0-473f-8471-e3887e5cc3df req-b74a5255-8e0d-47f8-a13e-2ca1b5f9307c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.212 187223 DEBUG oslo_concurrency.lockutils [req-eb414ed7-9bc0-473f-8471-e3887e5cc3df req-b74a5255-8e0d-47f8-a13e-2ca1b5f9307c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.212 187223 DEBUG oslo_concurrency.lockutils [req-eb414ed7-9bc0-473f-8471-e3887e5cc3df req-b74a5255-8e0d-47f8-a13e-2ca1b5f9307c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.212 187223 DEBUG oslo_concurrency.lockutils [req-eb414ed7-9bc0-473f-8471-e3887e5cc3df req-b74a5255-8e0d-47f8-a13e-2ca1b5f9307c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.213 187223 DEBUG nova.compute.manager [req-eb414ed7-9bc0-473f-8471-e3887e5cc3df req-b74a5255-8e0d-47f8-a13e-2ca1b5f9307c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.213 187223 DEBUG nova.compute.manager [req-eb414ed7-9bc0-473f-8471-e3887e5cc3df req-b74a5255-8e0d-47f8-a13e-2ca1b5f9307c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.425 187223 DEBUG nova.network.neutron [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Activated binding for port 320b23d6-cf08-4975-803c-f98b80008661 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.426 187223 DEBUG nova.compute.manager [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.427 187223 DEBUG nova.virt.libvirt.vif [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:10:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1636264059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1636264059',id=18,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:10:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-3qu9r21r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:11:23Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.427 187223 DEBUG nova.network.os_vif_util [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "320b23d6-cf08-4975-803c-f98b80008661", "address": "fa:16:3e:d8:ef:29", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap320b23d6-cf", "ovs_interfaceid": "320b23d6-cf08-4975-803c-f98b80008661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.428 187223 DEBUG nova.network.os_vif_util [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:ef:29,bridge_name='br-int',has_traffic_filtering=True,id=320b23d6-cf08-4975-803c-f98b80008661,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap320b23d6-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:11:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:37.428 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.429 187223 DEBUG os_vif [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:ef:29,bridge_name='br-int',has_traffic_filtering=True,id=320b23d6-cf08-4975-803c-f98b80008661,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap320b23d6-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:11:37 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:37.430 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.431 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.432 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap320b23d6-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.433 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.435 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.440 187223 INFO os_vif [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:ef:29,bridge_name='br-int',has_traffic_filtering=True,id=320b23d6-cf08-4975-803c-f98b80008661,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap320b23d6-cf')#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.440 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.441 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.441 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.441 187223 DEBUG nova.compute.manager [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.442 187223 INFO nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Deleting instance files /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1_del#033[00m
Nov 25 14:11:37 np0005535656 nova_compute[187219]: 2025-11-25 19:11:37.443 187223 INFO nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Deletion of /var/lib/nova/instances/ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1_del complete#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.800 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.800 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.801 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.801 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.801 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.801 187223 WARNING nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received unexpected event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.802 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.802 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.802 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.802 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.802 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.803 187223 WARNING nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received unexpected event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.803 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.803 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.804 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.804 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.804 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.804 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-unplugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.804 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.805 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.805 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.805 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.805 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.805 187223 WARNING nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received unexpected event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.806 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.806 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.806 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.806 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.807 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.807 187223 WARNING nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received unexpected event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.807 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.807 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.808 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.808 187223 DEBUG oslo_concurrency.lockutils [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.808 187223 DEBUG nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] No waiting events found dispatching network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:11:38 np0005535656 nova_compute[187219]: 2025-11-25 19:11:38.808 187223 WARNING nova.compute.manager [req-8a3ae5e8-e001-4f49-8bc8-007d8c936d50 req-ba088929-44d7-48a8-a6a1-c2b4211a8015 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Received unexpected event network-vif-plugged-320b23d6-cf08-4975-803c-f98b80008661 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:11:40 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 14:11:40 np0005535656 systemd[215431]: Activating special unit Exit the Session...
Nov 25 14:11:40 np0005535656 systemd[215431]: Stopped target Main User Target.
Nov 25 14:11:40 np0005535656 systemd[215431]: Stopped target Basic System.
Nov 25 14:11:40 np0005535656 systemd[215431]: Stopped target Paths.
Nov 25 14:11:40 np0005535656 systemd[215431]: Stopped target Sockets.
Nov 25 14:11:40 np0005535656 systemd[215431]: Stopped target Timers.
Nov 25 14:11:40 np0005535656 systemd[215431]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:11:40 np0005535656 systemd[215431]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 14:11:40 np0005535656 systemd[215431]: Closed D-Bus User Message Bus Socket.
Nov 25 14:11:40 np0005535656 systemd[215431]: Stopped Create User's Volatile Files and Directories.
Nov 25 14:11:40 np0005535656 systemd[215431]: Removed slice User Application Slice.
Nov 25 14:11:40 np0005535656 systemd[215431]: Reached target Shutdown.
Nov 25 14:11:40 np0005535656 systemd[215431]: Finished Exit the Session.
Nov 25 14:11:40 np0005535656 systemd[215431]: Reached target Exit the Session.
Nov 25 14:11:40 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 14:11:40 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 14:11:40 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 14:11:40 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 14:11:40 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 14:11:40 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 14:11:40 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 14:11:41 np0005535656 nova_compute[187219]: 2025-11-25 19:11:41.188 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:42 np0005535656 nova_compute[187219]: 2025-11-25 19:11:42.435 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.162 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.162 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.163 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.189 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.190 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.191 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.191 187223 DEBUG nova.compute.resource_tracker [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:11:43 np0005535656 podman[215569]: 2025-11-25 19:11:43.305220455 +0000 UTC m=+0.062033885 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.363 187223 WARNING nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.364 187223 DEBUG nova.compute.resource_tracker [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5860MB free_disk=73.16375732421875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.364 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.364 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.424 187223 DEBUG nova.compute.resource_tracker [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration for instance ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.445 187223 DEBUG nova.compute.resource_tracker [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.596 187223 DEBUG nova.compute.resource_tracker [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration 73d0cd1c-e84a-417c-bd8a-5391cef85cf0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.597 187223 DEBUG nova.compute.resource_tracker [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.597 187223 DEBUG nova.compute.resource_tracker [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.676 187223 DEBUG nova.compute.provider_tree [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.711 187223 DEBUG nova.scheduler.client.report [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.739 187223 DEBUG nova.compute.resource_tracker [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.740 187223 DEBUG oslo_concurrency.lockutils [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.744 187223 INFO nova.compute.manager [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.841 187223 INFO nova.scheduler.client.report [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Deleted allocation for migration 73d0cd1c-e84a-417c-bd8a-5391cef85cf0#033[00m
Nov 25 14:11:43 np0005535656 nova_compute[187219]: 2025-11-25 19:11:43.842 187223 DEBUG nova.virt.libvirt.driver [None req-1230a28c-1fd5-41a4-8583-54f332434bc4 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 25 14:11:44 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:44.432 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:11:46 np0005535656 nova_compute[187219]: 2025-11-25 19:11:46.189 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:47 np0005535656 nova_compute[187219]: 2025-11-25 19:11:47.437 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:11:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:11:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:11:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:11:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:11:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:11:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:11:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:11:51 np0005535656 nova_compute[187219]: 2025-11-25 19:11:51.191 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:51 np0005535656 nova_compute[187219]: 2025-11-25 19:11:51.219 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764097896.217962, ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:11:51 np0005535656 nova_compute[187219]: 2025-11-25 19:11:51.219 187223 INFO nova.compute.manager [-] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:11:51 np0005535656 nova_compute[187219]: 2025-11-25 19:11:51.275 187223 DEBUG nova.compute.manager [None req-42462524-fb35-4e8b-8fb0-0b3bd90ed796 - - - - - -] [instance: ffbab99f-50e8-4a4f-a6b2-eb5614adf0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:11:52 np0005535656 nova_compute[187219]: 2025-11-25 19:11:52.439 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:52 np0005535656 podman[215594]: 2025-11-25 19:11:52.957234498 +0000 UTC m=+0.064138240 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 25 14:11:53 np0005535656 podman[215593]: 2025-11-25 19:11:53.017056053 +0000 UTC m=+0.132157725 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 14:11:56 np0005535656 nova_compute[187219]: 2025-11-25 19:11:56.194 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:57 np0005535656 nova_compute[187219]: 2025-11-25 19:11:57.441 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:11:58 np0005535656 podman[215636]: 2025-11-25 19:11:58.968779821 +0000 UTC m=+0.077727667 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 25 14:11:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:59.088 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:11:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:59.089 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:11:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:11:59.089 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:01 np0005535656 nova_compute[187219]: 2025-11-25 19:12:01.194 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:02 np0005535656 nova_compute[187219]: 2025-11-25 19:12:02.442 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:02 np0005535656 nova_compute[187219]: 2025-11-25 19:12:02.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:12:02 np0005535656 podman[215658]: 2025-11-25 19:12:02.979707855 +0000 UTC m=+0.090597850 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:12:03 np0005535656 nova_compute[187219]: 2025-11-25 19:12:03.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:12:03 np0005535656 nova_compute[187219]: 2025-11-25 19:12:03.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:12:03 np0005535656 nova_compute[187219]: 2025-11-25 19:12:03.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:12:03 np0005535656 nova_compute[187219]: 2025-11-25 19:12:03.753 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:12:04 np0005535656 nova_compute[187219]: 2025-11-25 19:12:04.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:12:04 np0005535656 nova_compute[187219]: 2025-11-25 19:12:04.827 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:04 np0005535656 nova_compute[187219]: 2025-11-25 19:12:04.827 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:04 np0005535656 nova_compute[187219]: 2025-11-25 19:12:04.955 187223 DEBUG nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.212 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.213 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.224 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.224 187223 INFO nova.compute.claims [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.391 187223 DEBUG nova.compute.provider_tree [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.447 187223 DEBUG nova.scheduler.client.report [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.593 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.380s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.594 187223 DEBUG nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:12:05 np0005535656 podman[197580]: time="2025-11-25T19:12:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:12:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:12:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:12:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:12:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.726 187223 DEBUG nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.727 187223 DEBUG nova.network.neutron [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.828 187223 INFO nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:12:05 np0005535656 nova_compute[187219]: 2025-11-25 19:12:05.944 187223 DEBUG nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.114 187223 DEBUG nova.policy [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e60aa8a36ef94fa186a5c8de1df9e594', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab3670f92d82410b981d159346c0c038', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.196 187223 DEBUG nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.199 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.199 187223 INFO nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Creating image(s)#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.201 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "/var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.201 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.202 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.227 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.230 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.321 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.323 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.324 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.350 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.423 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.425 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.592 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk 1073741824" returned: 0 in 0.167s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.594 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.595 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.687 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.689 187223 DEBUG nova.virt.disk.api [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Checking if we can resize image /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.689 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.782 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.783 187223 DEBUG nova.virt.disk.api [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Cannot resize image /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.783 187223 DEBUG nova.objects.instance [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'migration_context' on Instance uuid 4dd056a9-18fa-4489-98e2-0363712f619a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.838 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.838 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Ensure instance console log exists: /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.839 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.839 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:06 np0005535656 nova_compute[187219]: 2025-11-25 19:12:06.839 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:07 np0005535656 nova_compute[187219]: 2025-11-25 19:12:07.444 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:07 np0005535656 nova_compute[187219]: 2025-11-25 19:12:07.518 187223 DEBUG nova.network.neutron [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Successfully created port: 9879f9ed-a8de-478e-b768-7d8bfcc3491f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.061 187223 DEBUG nova.network.neutron [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Successfully updated port: 9879f9ed-a8de-478e-b768-7d8bfcc3491f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.085 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.085 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquired lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.085 187223 DEBUG nova.network.neutron [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.181 187223 DEBUG nova.compute.manager [req-a2f49fc9-2b42-414b-a9c4-8dc63e6b2c18 req-2c983341-7277-4ad5-82eb-837cc4c57f8d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-changed-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.182 187223 DEBUG nova.compute.manager [req-a2f49fc9-2b42-414b-a9c4-8dc63e6b2c18 req-2c983341-7277-4ad5-82eb-837cc4c57f8d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Refreshing instance network info cache due to event network-changed-9879f9ed-a8de-478e-b768-7d8bfcc3491f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.182 187223 DEBUG oslo_concurrency.lockutils [req-a2f49fc9-2b42-414b-a9c4-8dc63e6b2c18 req-2c983341-7277-4ad5-82eb-837cc4c57f8d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.322 187223 DEBUG nova.network.neutron [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:12:09 np0005535656 nova_compute[187219]: 2025-11-25 19:12:09.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.506 187223 DEBUG nova.network.neutron [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Updating instance_info_cache with network_info: [{"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.535 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Releasing lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.535 187223 DEBUG nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Instance network_info: |[{"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.535 187223 DEBUG oslo_concurrency.lockutils [req-a2f49fc9-2b42-414b-a9c4-8dc63e6b2c18 req-2c983341-7277-4ad5-82eb-837cc4c57f8d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.536 187223 DEBUG nova.network.neutron [req-a2f49fc9-2b42-414b-a9c4-8dc63e6b2c18 req-2c983341-7277-4ad5-82eb-837cc4c57f8d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Refreshing network info cache for port 9879f9ed-a8de-478e-b768-7d8bfcc3491f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.538 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Start _get_guest_xml network_info=[{"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.542 187223 WARNING nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.546 187223 DEBUG nova.virt.libvirt.host [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.547 187223 DEBUG nova.virt.libvirt.host [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.549 187223 DEBUG nova.virt.libvirt.host [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.549 187223 DEBUG nova.virt.libvirt.host [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.550 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.550 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.551 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.551 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.551 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.551 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.552 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.552 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.552 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.552 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.552 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.553 187223 DEBUG nova.virt.hardware [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.555 187223 DEBUG nova.virt.libvirt.vif [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:12:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-172414304',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-172414304',id=19,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-gbd47jc0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=TagList,tas
k_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:12:06Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=4dd056a9-18fa-4489-98e2-0363712f619a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.556 187223 DEBUG nova.network.os_vif_util [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.556 187223 DEBUG nova.network.os_vif_util [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:81:9b,bridge_name='br-int',has_traffic_filtering=True,id=9879f9ed-a8de-478e-b768-7d8bfcc3491f,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9879f9ed-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.557 187223 DEBUG nova.objects.instance [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4dd056a9-18fa-4489-98e2-0363712f619a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.568 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <uuid>4dd056a9-18fa-4489-98e2-0363712f619a</uuid>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <name>instance-00000013</name>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteStrategies-server-172414304</nova:name>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:12:10</nova:creationTime>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:        <nova:user uuid="e60aa8a36ef94fa186a5c8de1df9e594">tempest-TestExecuteStrategies-2025590332-project-member</nova:user>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:        <nova:project uuid="ab3670f92d82410b981d159346c0c038">tempest-TestExecuteStrategies-2025590332</nova:project>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:        <nova:port uuid="9879f9ed-a8de-478e-b768-7d8bfcc3491f">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <entry name="serial">4dd056a9-18fa-4489-98e2-0363712f619a</entry>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <entry name="uuid">4dd056a9-18fa-4489-98e2-0363712f619a</entry>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk.config"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:5e:81:9b"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <target dev="tap9879f9ed-a8"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/console.log" append="off"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:12:10 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:12:10 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:12:10 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:12:10 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.570 187223 DEBUG nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Preparing to wait for external event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.570 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.570 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.570 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.571 187223 DEBUG nova.virt.libvirt.vif [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:12:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-172414304',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-172414304',id=19,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-gbd47jc0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=T
agList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:12:06Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=4dd056a9-18fa-4489-98e2-0363712f619a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.571 187223 DEBUG nova.network.os_vif_util [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.572 187223 DEBUG nova.network.os_vif_util [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:81:9b,bridge_name='br-int',has_traffic_filtering=True,id=9879f9ed-a8de-478e-b768-7d8bfcc3491f,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9879f9ed-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.572 187223 DEBUG os_vif [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:81:9b,bridge_name='br-int',has_traffic_filtering=True,id=9879f9ed-a8de-478e-b768-7d8bfcc3491f,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9879f9ed-a8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.572 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.573 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.573 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.575 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.575 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9879f9ed-a8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.576 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9879f9ed-a8, col_values=(('external_ids', {'iface-id': '9879f9ed-a8de-478e-b768-7d8bfcc3491f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:81:9b', 'vm-uuid': '4dd056a9-18fa-4489-98e2-0363712f619a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.577 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:10 np0005535656 NetworkManager[55548]: <info>  [1764097930.5789] manager: (tap9879f9ed-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.579 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.586 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.586 187223 INFO os_vif [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:81:9b,bridge_name='br-int',has_traffic_filtering=True,id=9879f9ed-a8de-478e-b768-7d8bfcc3491f,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9879f9ed-a8')#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.648 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.648 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.648 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No VIF found with MAC fa:16:3e:5e:81:9b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.649 187223 INFO nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Using config drive#033[00m
Nov 25 14:12:10 np0005535656 nova_compute[187219]: 2025-11-25 19:12:10.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.201 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.379 187223 INFO nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Creating config drive at /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk.config#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.384 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpav2211yx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.528 187223 DEBUG oslo_concurrency.processutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpav2211yx" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:12:11 np0005535656 kernel: tap9879f9ed-a8: entered promiscuous mode
Nov 25 14:12:11 np0005535656 NetworkManager[55548]: <info>  [1764097931.6136] manager: (tap9879f9ed-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Nov 25 14:12:11 np0005535656 ovn_controller[95460]: 2025-11-25T19:12:11Z|00146|binding|INFO|Claiming lport 9879f9ed-a8de-478e-b768-7d8bfcc3491f for this chassis.
Nov 25 14:12:11 np0005535656 ovn_controller[95460]: 2025-11-25T19:12:11Z|00147|binding|INFO|9879f9ed-a8de-478e-b768-7d8bfcc3491f: Claiming fa:16:3e:5e:81:9b 10.100.0.8
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.614 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.626 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:81:9b 10.100.0.8'], port_security=['fa:16:3e:5e:81:9b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '4dd056a9-18fa-4489-98e2-0363712f619a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=9879f9ed-a8de-478e-b768-7d8bfcc3491f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.629 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 9879f9ed-a8de-478e-b768-7d8bfcc3491f in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 bound to our chassis#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.630 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891#033[00m
Nov 25 14:12:11 np0005535656 ovn_controller[95460]: 2025-11-25T19:12:11Z|00148|binding|INFO|Setting lport 9879f9ed-a8de-478e-b768-7d8bfcc3491f ovn-installed in OVS
Nov 25 14:12:11 np0005535656 ovn_controller[95460]: 2025-11-25T19:12:11Z|00149|binding|INFO|Setting lport 9879f9ed-a8de-478e-b768-7d8bfcc3491f up in Southbound
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.643 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.651 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.649 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[8996d747-f11e-410e-984b-fe6264aedbd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.653 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e881e87-b1 in ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.655 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e881e87-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.656 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1e7c4f-9dee-416a-be69-5fc0e4ed60bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.658 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1689cf5a-ab67-49ce-8dd5-520cd14aeab5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 systemd-udevd[215713]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:12:11 np0005535656 systemd-machined[153481]: New machine qemu-13-instance-00000013.
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.675 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[e6140165-3d8e-4022-bacb-ee455d03e0c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 NetworkManager[55548]: <info>  [1764097931.6807] device (tap9879f9ed-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:12:11 np0005535656 systemd[1]: Started Virtual Machine qemu-13-instance-00000013.
Nov 25 14:12:11 np0005535656 NetworkManager[55548]: <info>  [1764097931.6817] device (tap9879f9ed-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.693 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[48f363d5-61e5-4586-9b7b-3128dd72512e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.732 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[2d7022d5-7261-4939-bb8c-403c4e0de819]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 systemd-udevd[215717]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:12:11 np0005535656 NetworkManager[55548]: <info>  [1764097931.7394] manager: (tap8e881e87-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.739 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[90797b8e-247d-4bc2-8eb9-b79200fea259]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.778 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[b0edf8b2-19a6-4d2e-bbf4-c4832dc8e6bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.783 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c48719-2a91-46db-8ccc-34b7a52c46b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 NetworkManager[55548]: <info>  [1764097931.8182] device (tap8e881e87-b0): carrier: link connected
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.825 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[64ffd859-aefe-427a-ad26-d18b9be9e3e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.842 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5e91899a-3951-4f0e-a02d-95978b1386e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489334, 'reachable_time': 16781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215746, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.863 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f02969ea-7217-4df7-a2cc-5a1408f7b058]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:6d5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489334, 'tstamp': 489334}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215747, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.886 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[89fbfe96-acc6-4a95-9cab-057ed16e1781]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489334, 'reachable_time': 16781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215748, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.923 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f02a4744-ac86-4eb7-8da7-1d8e3f919aec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.975 187223 DEBUG nova.compute.manager [req-d04ad9c0-dc28-4e2c-95d2-cb9b66593ba3 req-abf8bc73-db09-4244-93cf-d66d71a33486 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.976 187223 DEBUG oslo_concurrency.lockutils [req-d04ad9c0-dc28-4e2c-95d2-cb9b66593ba3 req-abf8bc73-db09-4244-93cf-d66d71a33486 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.976 187223 DEBUG oslo_concurrency.lockutils [req-d04ad9c0-dc28-4e2c-95d2-cb9b66593ba3 req-abf8bc73-db09-4244-93cf-d66d71a33486 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.976 187223 DEBUG oslo_concurrency.lockutils [req-d04ad9c0-dc28-4e2c-95d2-cb9b66593ba3 req-abf8bc73-db09-4244-93cf-d66d71a33486 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.977 187223 DEBUG nova.compute.manager [req-d04ad9c0-dc28-4e2c-95d2-cb9b66593ba3 req-abf8bc73-db09-4244-93cf-d66d71a33486 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Processing event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.993 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[247649c9-a202-4902-8dae-6c358beb01c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.994 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.995 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:12:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:11.995 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e881e87-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:12:11 np0005535656 kernel: tap8e881e87-b0: entered promiscuous mode
Nov 25 14:12:11 np0005535656 NetworkManager[55548]: <info>  [1764097931.9978] manager: (tap8e881e87-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Nov 25 14:12:11 np0005535656 nova_compute[187219]: 2025-11-25 19:12:11.997 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:12.005 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e881e87-b0, col_values=(('external_ids', {'iface-id': 'f01fca37-0f9e-4574-bd34-7de06647d521'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.006 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:12 np0005535656 ovn_controller[95460]: 2025-11-25T19:12:12Z|00150|binding|INFO|Releasing lport f01fca37-0f9e-4574-bd34-7de06647d521 from this chassis (sb_readonly=0)
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:12.009 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:12.010 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b406bbd2-b4c5-4e75-8189-b67014ac6468]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:12.011 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 14:12:12 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:12.013 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'env', 'PROCESS_TAG=haproxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e881e87-b103-4ad8-8de5-f8f4f0a10891.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.018 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.502 187223 DEBUG nova.network.neutron [req-a2f49fc9-2b42-414b-a9c4-8dc63e6b2c18 req-2c983341-7277-4ad5-82eb-837cc4c57f8d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Updated VIF entry in instance network info cache for port 9879f9ed-a8de-478e-b768-7d8bfcc3491f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.504 187223 DEBUG nova.network.neutron [req-a2f49fc9-2b42-414b-a9c4-8dc63e6b2c18 req-2c983341-7277-4ad5-82eb-837cc4c57f8d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Updating instance_info_cache with network_info: [{"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:12:12 np0005535656 podman[215780]: 2025-11-25 19:12:12.465410209 +0000 UTC m=+0.038163495 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.561 187223 DEBUG oslo_concurrency.lockutils [req-a2f49fc9-2b42-414b-a9c4-8dc63e6b2c18 req-2c983341-7277-4ad5-82eb-837cc4c57f8d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.580 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097932.5796494, 4dd056a9-18fa-4489-98e2-0363712f619a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.580 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] VM Started (Lifecycle Event)#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.583 187223 DEBUG nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.587 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.590 187223 INFO nova.virt.libvirt.driver [-] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Instance spawned successfully.#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.591 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.674 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.679 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.680 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.680 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.681 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.681 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.681 187223 DEBUG nova.virt.libvirt.driver [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.688 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.725 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.726 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.726 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.726 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.759 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.759 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097932.5807045, 4dd056a9-18fa-4489-98e2-0363712f619a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.760 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:12:12 np0005535656 podman[215780]: 2025-11-25 19:12:12.768822125 +0000 UTC m=+0.341575321 container create bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.812 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.817 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764097932.586105, 4dd056a9-18fa-4489-98e2-0363712f619a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.818 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.832 187223 INFO nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Took 6.64 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.833 187223 DEBUG nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.865 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.873 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.876 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:12:12 np0005535656 systemd[1]: Started libpod-conmon-bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1.scope.
Nov 25 14:12:12 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:12:12 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/557f4e30ba4306f715f43ff6dc86108e59f39a5c46f9f9905960573dc9f17d61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.931 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:12:12 np0005535656 podman[215780]: 2025-11-25 19:12:12.939394599 +0000 UTC m=+0.512147835 container init bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 25 14:12:12 np0005535656 podman[215780]: 2025-11-25 19:12:12.947335841 +0000 UTC m=+0.520089047 container start bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 25 14:12:12 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215803]: [NOTICE]   (215808) : New worker (215812) forked
Nov 25 14:12:12 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215803]: [NOTICE]   (215808) : Loading success.
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.975 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:12:12 np0005535656 nova_compute[187219]: 2025-11-25 19:12:12.976 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.003 187223 INFO nova.compute.manager [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Took 7.82 seconds to build instance.#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.065 187223 DEBUG oslo_concurrency.lockutils [None req-12d950e9-89d2-40fd-9ac6-6eb9522ed900 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.071 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.225 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.226 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5789MB free_disk=73.16210174560547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.227 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.227 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.362 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 4dd056a9-18fa-4489-98e2-0363712f619a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.362 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.362 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.414 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.480 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.577 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:12:13 np0005535656 nova_compute[187219]: 2025-11-25 19:12:13.577 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:13 np0005535656 podman[215824]: 2025-11-25 19:12:13.974328681 +0000 UTC m=+0.076989225 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:12:14 np0005535656 nova_compute[187219]: 2025-11-25 19:12:14.107 187223 DEBUG nova.compute.manager [req-aadd7140-b142-4a78-b1a1-68e33e6716e6 req-43390600-60ac-417e-9db4-adabb8ea3dc8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:12:14 np0005535656 nova_compute[187219]: 2025-11-25 19:12:14.107 187223 DEBUG oslo_concurrency.lockutils [req-aadd7140-b142-4a78-b1a1-68e33e6716e6 req-43390600-60ac-417e-9db4-adabb8ea3dc8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:14 np0005535656 nova_compute[187219]: 2025-11-25 19:12:14.107 187223 DEBUG oslo_concurrency.lockutils [req-aadd7140-b142-4a78-b1a1-68e33e6716e6 req-43390600-60ac-417e-9db4-adabb8ea3dc8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:14 np0005535656 nova_compute[187219]: 2025-11-25 19:12:14.107 187223 DEBUG oslo_concurrency.lockutils [req-aadd7140-b142-4a78-b1a1-68e33e6716e6 req-43390600-60ac-417e-9db4-adabb8ea3dc8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:12:14 np0005535656 nova_compute[187219]: 2025-11-25 19:12:14.108 187223 DEBUG nova.compute.manager [req-aadd7140-b142-4a78-b1a1-68e33e6716e6 req-43390600-60ac-417e-9db4-adabb8ea3dc8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] No waiting events found dispatching network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:12:14 np0005535656 nova_compute[187219]: 2025-11-25 19:12:14.108 187223 WARNING nova.compute.manager [req-aadd7140-b142-4a78-b1a1-68e33e6716e6 req-43390600-60ac-417e-9db4-adabb8ea3dc8 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received unexpected event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f for instance with vm_state active and task_state None.#033[00m
Nov 25 14:12:15 np0005535656 nova_compute[187219]: 2025-11-25 19:12:15.579 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:16 np0005535656 nova_compute[187219]: 2025-11-25 19:12:16.204 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:12:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:12:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:12:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:12:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:12:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:12:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:12:20 np0005535656 nova_compute[187219]: 2025-11-25 19:12:20.581 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:21 np0005535656 nova_compute[187219]: 2025-11-25 19:12:21.209 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:23 np0005535656 podman[215852]: 2025-11-25 19:12:23.969510987 +0000 UTC m=+0.067209164 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 25 14:12:23 np0005535656 podman[215851]: 2025-11-25 19:12:23.994489116 +0000 UTC m=+0.103396453 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 25 14:12:25 np0005535656 ovn_controller[95460]: 2025-11-25T19:12:25Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:81:9b 10.100.0.8
Nov 25 14:12:25 np0005535656 ovn_controller[95460]: 2025-11-25T19:12:25Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:81:9b 10.100.0.8
Nov 25 14:12:25 np0005535656 nova_compute[187219]: 2025-11-25 19:12:25.584 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:26 np0005535656 nova_compute[187219]: 2025-11-25 19:12:26.210 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:29 np0005535656 podman[215909]: 2025-11-25 19:12:29.99642244 +0000 UTC m=+0.109016684 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, 
io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 25 14:12:30 np0005535656 nova_compute[187219]: 2025-11-25 19:12:30.617 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:31 np0005535656 nova_compute[187219]: 2025-11-25 19:12:31.214 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:33 np0005535656 podman[215930]: 2025-11-25 19:12:33.96537055 +0000 UTC m=+0.070321067 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:12:35 np0005535656 nova_compute[187219]: 2025-11-25 19:12:35.620 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:35 np0005535656 podman[197580]: time="2025-11-25T19:12:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:12:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:12:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:12:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:12:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Nov 25 14:12:36 np0005535656 nova_compute[187219]: 2025-11-25 19:12:36.217 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:40 np0005535656 nova_compute[187219]: 2025-11-25 19:12:40.622 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:41 np0005535656 nova_compute[187219]: 2025-11-25 19:12:41.219 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:44 np0005535656 podman[215950]: 2025-11-25 19:12:44.97990234 +0000 UTC m=+0.080259094 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:12:45 np0005535656 nova_compute[187219]: 2025-11-25 19:12:45.625 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:46 np0005535656 nova_compute[187219]: 2025-11-25 19:12:46.220 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:12:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:12:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:12:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:12:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:12:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:12:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:12:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:12:50 np0005535656 nova_compute[187219]: 2025-11-25 19:12:50.627 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:51 np0005535656 nova_compute[187219]: 2025-11-25 19:12:51.222 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:53 np0005535656 ovn_controller[95460]: 2025-11-25T19:12:53Z|00151|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 25 14:12:54 np0005535656 podman[215975]: 2025-11-25 19:12:54.985541885 +0000 UTC m=+0.085301758 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:12:55 np0005535656 podman[215974]: 2025-11-25 19:12:55.0330761 +0000 UTC m=+0.141513646 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:12:55 np0005535656 nova_compute[187219]: 2025-11-25 19:12:55.629 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:56 np0005535656 nova_compute[187219]: 2025-11-25 19:12:56.224 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:12:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:59.089 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:12:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:59.091 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:12:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:12:59.091 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:13:00 np0005535656 nova_compute[187219]: 2025-11-25 19:13:00.631 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:00 np0005535656 podman[216022]: 2025-11-25 19:13:00.966515108 +0000 UTC m=+0.085808332 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 14:13:01 np0005535656 nova_compute[187219]: 2025-11-25 19:13:01.255 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:04 np0005535656 nova_compute[187219]: 2025-11-25 19:13:04.576 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:04 np0005535656 podman[216043]: 2025-11-25 19:13:04.965685278 +0000 UTC m=+0.079888324 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 25 14:13:05 np0005535656 nova_compute[187219]: 2025-11-25 19:13:05.634 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:05 np0005535656 podman[197580]: time="2025-11-25T19:13:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:13:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:13:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:13:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:13:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Nov 25 14:13:05 np0005535656 nova_compute[187219]: 2025-11-25 19:13:05.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:05 np0005535656 nova_compute[187219]: 2025-11-25 19:13:05.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:13:05 np0005535656 nova_compute[187219]: 2025-11-25 19:13:05.674 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:13:06 np0005535656 nova_compute[187219]: 2025-11-25 19:13:06.124 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:13:06 np0005535656 nova_compute[187219]: 2025-11-25 19:13:06.124 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:13:06 np0005535656 nova_compute[187219]: 2025-11-25 19:13:06.125 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 14:13:06 np0005535656 nova_compute[187219]: 2025-11-25 19:13:06.125 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4dd056a9-18fa-4489-98e2-0363712f619a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:13:06 np0005535656 nova_compute[187219]: 2025-11-25 19:13:06.300 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:07 np0005535656 nova_compute[187219]: 2025-11-25 19:13:07.486 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Updating instance_info_cache with network_info: [{"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:13:07 np0005535656 nova_compute[187219]: 2025-11-25 19:13:07.511 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:13:07 np0005535656 nova_compute[187219]: 2025-11-25 19:13:07.512 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 14:13:07 np0005535656 nova_compute[187219]: 2025-11-25 19:13:07.513 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:09 np0005535656 nova_compute[187219]: 2025-11-25 19:13:09.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:09 np0005535656 nova_compute[187219]: 2025-11-25 19:13:09.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:10 np0005535656 nova_compute[187219]: 2025-11-25 19:13:10.638 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:10 np0005535656 nova_compute[187219]: 2025-11-25 19:13:10.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:11 np0005535656 nova_compute[187219]: 2025-11-25 19:13:11.302 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:11 np0005535656 nova_compute[187219]: 2025-11-25 19:13:11.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:11 np0005535656 nova_compute[187219]: 2025-11-25 19:13:11.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:13:12 np0005535656 nova_compute[187219]: 2025-11-25 19:13:12.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:12 np0005535656 nova_compute[187219]: 2025-11-25 19:13:12.686 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:12 np0005535656 nova_compute[187219]: 2025-11-25 19:13:12.887 187223 DEBUG nova.compute.manager [None req-41bd3e69-f1be-448a-a056-d2f8d0c4c8b2 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610#033[00m
Nov 25 14:13:12 np0005535656 nova_compute[187219]: 2025-11-25 19:13:12.943 187223 DEBUG nova.compute.provider_tree [None req-41bd3e69-f1be-448a-a056-d2f8d0c4c8b2 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 28 to 35 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 14:13:13 np0005535656 nova_compute[187219]: 2025-11-25 19:13:13.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:13:13 np0005535656 nova_compute[187219]: 2025-11-25 19:13:13.693 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:13:13 np0005535656 nova_compute[187219]: 2025-11-25 19:13:13.694 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:13:13 np0005535656 nova_compute[187219]: 2025-11-25 19:13:13.694 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:13:13 np0005535656 nova_compute[187219]: 2025-11-25 19:13:13.694 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:13:13 np0005535656 nova_compute[187219]: 2025-11-25 19:13:13.763 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:13:13 np0005535656 nova_compute[187219]: 2025-11-25 19:13:13.858 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:13:13 np0005535656 nova_compute[187219]: 2025-11-25 19:13:13.859 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:13:13 np0005535656 nova_compute[187219]: 2025-11-25 19:13:13.951 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.158 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.160 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5708MB free_disk=73.13368606567383GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.160 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.160 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.255 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 4dd056a9-18fa-4489-98e2-0363712f619a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.256 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.256 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.275 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing inventories for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.295 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating ProviderTree inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.295 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.333 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing aggregate associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.355 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing trait associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STATUS_DISABLED,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.405 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.424 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.425 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:13:14 np0005535656 nova_compute[187219]: 2025-11-25 19:13:14.426 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:13:15 np0005535656 nova_compute[187219]: 2025-11-25 19:13:15.640 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:15 np0005535656 podman[216069]: 2025-11-25 19:13:15.968346598 +0000 UTC m=+0.077784237 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 14:13:17 np0005535656 nova_compute[187219]: 2025-11-25 19:13:17.489 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:18 np0005535656 nova_compute[187219]: 2025-11-25 19:13:18.425 187223 DEBUG nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Check if temp file /var/lib/nova/instances/tmp4zmlhf9s exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 25 14:13:18 np0005535656 nova_compute[187219]: 2025-11-25 19:13:18.426 187223 DEBUG nova.compute.manager [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4zmlhf9s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4dd056a9-18fa-4489-98e2-0363712f619a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 25 14:13:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:13:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:13:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:13:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:13:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:13:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:13:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:13:19 np0005535656 nova_compute[187219]: 2025-11-25 19:13:19.494 187223 DEBUG oslo_concurrency.processutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:13:19 np0005535656 nova_compute[187219]: 2025-11-25 19:13:19.554 187223 DEBUG oslo_concurrency.processutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:13:19 np0005535656 nova_compute[187219]: 2025-11-25 19:13:19.556 187223 DEBUG oslo_concurrency.processutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:13:19 np0005535656 nova_compute[187219]: 2025-11-25 19:13:19.620 187223 DEBUG oslo_concurrency.processutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:13:20 np0005535656 nova_compute[187219]: 2025-11-25 19:13:20.643 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:21 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 14:13:21 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 14:13:21 np0005535656 systemd-logind[788]: New session 38 of user nova.
Nov 25 14:13:21 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 14:13:21 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 14:13:22 np0005535656 systemd[216104]: Queued start job for default target Main User Target.
Nov 25 14:13:22 np0005535656 systemd[216104]: Created slice User Application Slice.
Nov 25 14:13:22 np0005535656 systemd[216104]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:13:22 np0005535656 systemd[216104]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 14:13:22 np0005535656 systemd[216104]: Reached target Paths.
Nov 25 14:13:22 np0005535656 systemd[216104]: Reached target Timers.
Nov 25 14:13:22 np0005535656 systemd[216104]: Starting D-Bus User Message Bus Socket...
Nov 25 14:13:22 np0005535656 systemd[216104]: Starting Create User's Volatile Files and Directories...
Nov 25 14:13:22 np0005535656 systemd[216104]: Listening on D-Bus User Message Bus Socket.
Nov 25 14:13:22 np0005535656 systemd[216104]: Reached target Sockets.
Nov 25 14:13:22 np0005535656 systemd[216104]: Finished Create User's Volatile Files and Directories.
Nov 25 14:13:22 np0005535656 systemd[216104]: Reached target Basic System.
Nov 25 14:13:22 np0005535656 systemd[216104]: Reached target Main User Target.
Nov 25 14:13:22 np0005535656 systemd[216104]: Startup finished in 195ms.
Nov 25 14:13:22 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 14:13:22 np0005535656 systemd[1]: Started Session 38 of User nova.
Nov 25 14:13:22 np0005535656 systemd[1]: session-38.scope: Deactivated successfully.
Nov 25 14:13:22 np0005535656 systemd-logind[788]: Session 38 logged out. Waiting for processes to exit.
Nov 25 14:13:22 np0005535656 systemd-logind[788]: Removed session 38.
Nov 25 14:13:22 np0005535656 nova_compute[187219]: 2025-11-25 19:13:22.493 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:24 np0005535656 nova_compute[187219]: 2025-11-25 19:13:24.419 187223 DEBUG nova.compute.manager [req-50c5ad8c-e018-441f-855b-078f2a78e528 req-59cbd5ca-ade5-424a-9b40-40509c59ae1a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-unplugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:13:24 np0005535656 nova_compute[187219]: 2025-11-25 19:13:24.419 187223 DEBUG oslo_concurrency.lockutils [req-50c5ad8c-e018-441f-855b-078f2a78e528 req-59cbd5ca-ade5-424a-9b40-40509c59ae1a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:13:24 np0005535656 nova_compute[187219]: 2025-11-25 19:13:24.420 187223 DEBUG oslo_concurrency.lockutils [req-50c5ad8c-e018-441f-855b-078f2a78e528 req-59cbd5ca-ade5-424a-9b40-40509c59ae1a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:13:24 np0005535656 nova_compute[187219]: 2025-11-25 19:13:24.420 187223 DEBUG oslo_concurrency.lockutils [req-50c5ad8c-e018-441f-855b-078f2a78e528 req-59cbd5ca-ade5-424a-9b40-40509c59ae1a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:13:24 np0005535656 nova_compute[187219]: 2025-11-25 19:13:24.420 187223 DEBUG nova.compute.manager [req-50c5ad8c-e018-441f-855b-078f2a78e528 req-59cbd5ca-ade5-424a-9b40-40509c59ae1a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] No waiting events found dispatching network-vif-unplugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:13:24 np0005535656 nova_compute[187219]: 2025-11-25 19:13:24.420 187223 DEBUG nova.compute.manager [req-50c5ad8c-e018-441f-855b-078f2a78e528 req-59cbd5ca-ade5-424a-9b40-40509c59ae1a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-unplugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:13:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:25.166 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:13:25 np0005535656 nova_compute[187219]: 2025-11-25 19:13:25.167 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:25.167 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:13:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:25.168 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:13:25 np0005535656 nova_compute[187219]: 2025-11-25 19:13:25.646 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:25 np0005535656 podman[216122]: 2025-11-25 19:13:25.980077396 +0000 UTC m=+0.098825940 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 25 14:13:26 np0005535656 podman[216121]: 2025-11-25 19:13:26.026400358 +0000 UTC m=+0.140704703 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.493 187223 DEBUG nova.compute.manager [req-bd62c37b-a7b7-4370-9279-9f453cb5a4b0 req-23e5c2d2-6fe4-4079-8c80-9dc8f1f5d71f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.494 187223 DEBUG oslo_concurrency.lockutils [req-bd62c37b-a7b7-4370-9279-9f453cb5a4b0 req-23e5c2d2-6fe4-4079-8c80-9dc8f1f5d71f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.494 187223 DEBUG oslo_concurrency.lockutils [req-bd62c37b-a7b7-4370-9279-9f453cb5a4b0 req-23e5c2d2-6fe4-4079-8c80-9dc8f1f5d71f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.495 187223 DEBUG oslo_concurrency.lockutils [req-bd62c37b-a7b7-4370-9279-9f453cb5a4b0 req-23e5c2d2-6fe4-4079-8c80-9dc8f1f5d71f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.495 187223 DEBUG nova.compute.manager [req-bd62c37b-a7b7-4370-9279-9f453cb5a4b0 req-23e5c2d2-6fe4-4079-8c80-9dc8f1f5d71f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] No waiting events found dispatching network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.496 187223 WARNING nova.compute.manager [req-bd62c37b-a7b7-4370-9279-9f453cb5a4b0 req-23e5c2d2-6fe4-4079-8c80-9dc8f1f5d71f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received unexpected event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.585 187223 INFO nova.compute.manager [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Took 6.96 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.586 187223 DEBUG nova.compute.manager [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.605 187223 DEBUG nova.compute.manager [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4zmlhf9s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4dd056a9-18fa-4489-98e2-0363712f619a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(aba3d19c-da76-4dcf-bab1-d41669693cd5),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.634 187223 DEBUG nova.objects.instance [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid 4dd056a9-18fa-4489-98e2-0363712f619a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.635 187223 DEBUG nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.637 187223 DEBUG nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.638 187223 DEBUG nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.655 187223 DEBUG nova.virt.libvirt.vif [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:12:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-172414304',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-172414304',id=19,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:12:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-gbd47jc0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:12:12Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=4dd056a9-18fa-4489-98e2-0363712f619a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.656 187223 DEBUG nova.network.os_vif_util [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.657 187223 DEBUG nova.network.os_vif_util [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:81:9b,bridge_name='br-int',has_traffic_filtering=True,id=9879f9ed-a8de-478e-b768-7d8bfcc3491f,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9879f9ed-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.658 187223 DEBUG nova.virt.libvirt.migration [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 14:13:26 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:5e:81:9b"/>
Nov 25 14:13:26 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 14:13:26 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:13:26 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 14:13:26 np0005535656 nova_compute[187219]:  <target dev="tap9879f9ed-a8"/>
Nov 25 14:13:26 np0005535656 nova_compute[187219]: </interface>
Nov 25 14:13:26 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 14:13:26 np0005535656 nova_compute[187219]: 2025-11-25 19:13:26.659 187223 DEBUG nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 14:13:27 np0005535656 nova_compute[187219]: 2025-11-25 19:13:27.141 187223 DEBUG nova.virt.libvirt.migration [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:13:27 np0005535656 nova_compute[187219]: 2025-11-25 19:13:27.142 187223 INFO nova.virt.libvirt.migration [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 14:13:27 np0005535656 nova_compute[187219]: 2025-11-25 19:13:27.259 187223 INFO nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 14:13:27 np0005535656 nova_compute[187219]: 2025-11-25 19:13:27.494 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:27 np0005535656 nova_compute[187219]: 2025-11-25 19:13:27.763 187223 DEBUG nova.virt.libvirt.migration [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:13:27 np0005535656 nova_compute[187219]: 2025-11-25 19:13:27.764 187223 DEBUG nova.virt.libvirt.migration [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.001 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098008.0005212, 4dd056a9-18fa-4489-98e2-0363712f619a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.002 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.022 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.029 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.059 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 14:13:28 np0005535656 kernel: tap9879f9ed-a8 (unregistering): left promiscuous mode
Nov 25 14:13:28 np0005535656 NetworkManager[55548]: <info>  [1764098008.1862] device (tap9879f9ed-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.194 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:28 np0005535656 ovn_controller[95460]: 2025-11-25T19:13:28Z|00152|binding|INFO|Releasing lport 9879f9ed-a8de-478e-b768-7d8bfcc3491f from this chassis (sb_readonly=0)
Nov 25 14:13:28 np0005535656 ovn_controller[95460]: 2025-11-25T19:13:28Z|00153|binding|INFO|Setting lport 9879f9ed-a8de-478e-b768-7d8bfcc3491f down in Southbound
Nov 25 14:13:28 np0005535656 ovn_controller[95460]: 2025-11-25T19:13:28Z|00154|binding|INFO|Removing iface tap9879f9ed-a8 ovn-installed in OVS
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.213 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.214 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:81:9b 10.100.0.8'], port_security=['fa:16:3e:5e:81:9b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '4dd056a9-18fa-4489-98e2-0363712f619a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=9879f9ed-a8de-478e-b768-7d8bfcc3491f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.218 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 9879f9ed-a8de-478e-b768-7d8bfcc3491f in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.221 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.223 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[92149169-1f13-4398-8893-cab2a4ee9c0b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.224 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace which is not needed anymore#033[00m
Nov 25 14:13:28 np0005535656 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Deactivated successfully.
Nov 25 14:13:28 np0005535656 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Consumed 17.489s CPU time.
Nov 25 14:13:28 np0005535656 systemd-machined[153481]: Machine qemu-13-instance-00000013 terminated.
Nov 25 14:13:28 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215803]: [NOTICE]   (215808) : haproxy version is 2.8.14-c23fe91
Nov 25 14:13:28 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215803]: [NOTICE]   (215808) : path to executable is /usr/sbin/haproxy
Nov 25 14:13:28 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215803]: [WARNING]  (215808) : Exiting Master process...
Nov 25 14:13:28 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215803]: [ALERT]    (215808) : Current worker (215812) exited with code 143 (Terminated)
Nov 25 14:13:28 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[215803]: [WARNING]  (215808) : All workers exited. Exiting... (0)
Nov 25 14:13:28 np0005535656 systemd[1]: libpod-bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1.scope: Deactivated successfully.
Nov 25 14:13:28 np0005535656 podman[216207]: 2025-11-25 19:13:28.422411859 +0000 UTC m=+0.070278695 container died bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.442 187223 DEBUG nova.virt.libvirt.guest [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.443 187223 INFO nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Migration operation has completed#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.443 187223 INFO nova.compute.manager [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] _post_live_migration() is started..#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.450 187223 DEBUG nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.451 187223 DEBUG nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.451 187223 DEBUG nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 14:13:28 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1-userdata-shm.mount: Deactivated successfully.
Nov 25 14:13:28 np0005535656 systemd[1]: var-lib-containers-storage-overlay-557f4e30ba4306f715f43ff6dc86108e59f39a5c46f9f9905960573dc9f17d61-merged.mount: Deactivated successfully.
Nov 25 14:13:28 np0005535656 podman[216207]: 2025-11-25 19:13:28.466406228 +0000 UTC m=+0.114273064 container cleanup bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 14:13:28 np0005535656 systemd[1]: libpod-conmon-bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1.scope: Deactivated successfully.
Nov 25 14:13:28 np0005535656 podman[216251]: 2025-11-25 19:13:28.535416869 +0000 UTC m=+0.043879618 container remove bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.541 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[55359be3-fb3e-4ce0-ba59-af2c079e4eae]: (4, ('Tue Nov 25 07:13:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1)\nbf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1\nTue Nov 25 07:13:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (bf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1)\nbf2b6fbc75c020c086a071bc2603cfcdd37b771886d63d3f1c7d9c3ba8220ac1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.543 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2883ac2c-ce6e-4585-83bd-41bcb3ad75ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.545 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:13:28 np0005535656 kernel: tap8e881e87-b0: left promiscuous mode
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.547 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.562 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.565 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d779f2-edd0-4027-b954-4ada1cd87515]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.589 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[0ecb3c41-2980-4b05-88c3-cea5ab8e1ddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.591 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6186646a-03d9-4381-84ba-cef8eabbff28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.602 187223 DEBUG nova.compute.manager [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-changed-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.602 187223 DEBUG nova.compute.manager [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Refreshing instance network info cache due to event network-changed-9879f9ed-a8de-478e-b768-7d8bfcc3491f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.602 187223 DEBUG oslo_concurrency.lockutils [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.602 187223 DEBUG oslo_concurrency.lockutils [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:13:28 np0005535656 nova_compute[187219]: 2025-11-25 19:13:28.603 187223 DEBUG nova.network.neutron [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Refreshing network info cache for port 9879f9ed-a8de-478e-b768-7d8bfcc3491f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.607 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4ab4f0-9712-45ae-ab61-3a9f34eac742]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489324, 'reachable_time': 39247, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216270, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.612 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:13:28 np0005535656 systemd[1]: run-netns-ovnmeta\x2d8e881e87\x2db103\x2d4ad8\x2d8de5\x2df8f4f0a10891.mount: Deactivated successfully.
Nov 25 14:13:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:28.612 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[16c346f1-21a7-4002-af59-2070101d0a58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.159 187223 DEBUG nova.compute.manager [req-fcf143ab-ee98-418e-941e-6f647dad01c1 req-ff271051-a8f4-4d32-aa16-fb6f00b76c81 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-unplugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.159 187223 DEBUG oslo_concurrency.lockutils [req-fcf143ab-ee98-418e-941e-6f647dad01c1 req-ff271051-a8f4-4d32-aa16-fb6f00b76c81 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.159 187223 DEBUG oslo_concurrency.lockutils [req-fcf143ab-ee98-418e-941e-6f647dad01c1 req-ff271051-a8f4-4d32-aa16-fb6f00b76c81 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.159 187223 DEBUG oslo_concurrency.lockutils [req-fcf143ab-ee98-418e-941e-6f647dad01c1 req-ff271051-a8f4-4d32-aa16-fb6f00b76c81 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.160 187223 DEBUG nova.compute.manager [req-fcf143ab-ee98-418e-941e-6f647dad01c1 req-ff271051-a8f4-4d32-aa16-fb6f00b76c81 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] No waiting events found dispatching network-vif-unplugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.160 187223 DEBUG nova.compute.manager [req-fcf143ab-ee98-418e-941e-6f647dad01c1 req-ff271051-a8f4-4d32-aa16-fb6f00b76c81 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-unplugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.252 187223 DEBUG nova.network.neutron [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Activated binding for port 9879f9ed-a8de-478e-b768-7d8bfcc3491f and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.252 187223 DEBUG nova.compute.manager [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.253 187223 DEBUG nova.virt.libvirt.vif [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:12:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-172414304',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-172414304',id=19,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:12:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-gbd47jc0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:13:15Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=4dd056a9-18fa-4489-98e2-0363712f619a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.253 187223 DEBUG nova.network.os_vif_util [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.254 187223 DEBUG nova.network.os_vif_util [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:81:9b,bridge_name='br-int',has_traffic_filtering=True,id=9879f9ed-a8de-478e-b768-7d8bfcc3491f,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9879f9ed-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.254 187223 DEBUG os_vif [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:81:9b,bridge_name='br-int',has_traffic_filtering=True,id=9879f9ed-a8de-478e-b768-7d8bfcc3491f,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9879f9ed-a8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.255 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.255 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9879f9ed-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.257 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.258 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.260 187223 INFO os_vif [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:81:9b,bridge_name='br-int',has_traffic_filtering=True,id=9879f9ed-a8de-478e-b768-7d8bfcc3491f,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9879f9ed-a8')#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.260 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.260 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.260 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.260 187223 DEBUG nova.compute.manager [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.261 187223 INFO nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Deleting instance files /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a_del#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.261 187223 INFO nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Deletion of /var/lib/nova/instances/4dd056a9-18fa-4489-98e2-0363712f619a_del complete#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.964 187223 DEBUG nova.network.neutron [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Updated VIF entry in instance network info cache for port 9879f9ed-a8de-478e-b768-7d8bfcc3491f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.964 187223 DEBUG nova.network.neutron [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Updating instance_info_cache with network_info: [{"id": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "address": "fa:16:3e:5e:81:9b", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9879f9ed-a8", "ovs_interfaceid": "9879f9ed-a8de-478e-b768-7d8bfcc3491f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.997 187223 DEBUG oslo_concurrency.lockutils [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-4dd056a9-18fa-4489-98e2-0363712f619a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.998 187223 DEBUG nova.compute.manager [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-unplugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.998 187223 DEBUG oslo_concurrency.lockutils [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.998 187223 DEBUG oslo_concurrency.lockutils [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.999 187223 DEBUG oslo_concurrency.lockutils [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.999 187223 DEBUG nova.compute.manager [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] No waiting events found dispatching network-vif-unplugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:13:29 np0005535656 nova_compute[187219]: 2025-11-25 19:13:29.999 187223 DEBUG nova.compute.manager [req-06c6003a-6722-460a-af9b-2aed85d399b0 req-4fd366a3-a666-4f38-8342-063263b33426 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-unplugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.927 187223 DEBUG nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.928 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.928 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.928 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.929 187223 DEBUG nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] No waiting events found dispatching network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.929 187223 WARNING nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received unexpected event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f for instance with vm_state active and task_state migrating.
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.929 187223 DEBUG nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.929 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.929 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.930 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.930 187223 DEBUG nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] No waiting events found dispatching network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.930 187223 WARNING nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received unexpected event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f for instance with vm_state active and task_state migrating.
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.930 187223 DEBUG nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.930 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.931 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.931 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.931 187223 DEBUG nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] No waiting events found dispatching network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.931 187223 WARNING nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received unexpected event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f for instance with vm_state active and task_state migrating.
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.932 187223 DEBUG nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.932 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.932 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.932 187223 DEBUG oslo_concurrency.lockutils [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.932 187223 DEBUG nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] No waiting events found dispatching network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 25 14:13:30 np0005535656 nova_compute[187219]: 2025-11-25 19:13:30.932 187223 WARNING nova.compute.manager [req-86a9a03a-e8dc-4ed3-a71b-7f5b7945a852 req-caab4021-9127-42bc-bb27-35b1241b2b07 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Received unexpected event network-vif-plugged-9879f9ed-a8de-478e-b768-7d8bfcc3491f for instance with vm_state active and task_state migrating.
Nov 25 14:13:31 np0005535656 podman[216271]: 2025-11-25 19:13:31.986510721 +0000 UTC m=+0.101507472 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 14:13:32 np0005535656 nova_compute[187219]: 2025-11-25 19:13:32.497 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:13:32 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 14:13:32 np0005535656 systemd[216104]: Activating special unit Exit the Session...
Nov 25 14:13:32 np0005535656 systemd[216104]: Stopped target Main User Target.
Nov 25 14:13:32 np0005535656 systemd[216104]: Stopped target Basic System.
Nov 25 14:13:32 np0005535656 systemd[216104]: Stopped target Paths.
Nov 25 14:13:32 np0005535656 systemd[216104]: Stopped target Sockets.
Nov 25 14:13:32 np0005535656 systemd[216104]: Stopped target Timers.
Nov 25 14:13:32 np0005535656 systemd[216104]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:13:32 np0005535656 systemd[216104]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 14:13:32 np0005535656 systemd[216104]: Closed D-Bus User Message Bus Socket.
Nov 25 14:13:32 np0005535656 systemd[216104]: Stopped Create User's Volatile Files and Directories.
Nov 25 14:13:32 np0005535656 systemd[216104]: Removed slice User Application Slice.
Nov 25 14:13:32 np0005535656 systemd[216104]: Reached target Shutdown.
Nov 25 14:13:32 np0005535656 systemd[216104]: Finished Exit the Session.
Nov 25 14:13:32 np0005535656 systemd[216104]: Reached target Exit the Session.
Nov 25 14:13:32 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 14:13:32 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 14:13:32 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 14:13:32 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 14:13:32 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 14:13:32 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 14:13:32 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 14:13:34 np0005535656 nova_compute[187219]: 2025-11-25 19:13:34.258 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:13:35 np0005535656 podman[197580]: time="2025-11-25T19:13:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:13:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:13:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:13:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:13:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 25 14:13:35 np0005535656 podman[216296]: 2025-11-25 19:13:35.981698575 +0000 UTC m=+0.092467511 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.498 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.505 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.505 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.506 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4dd056a9-18fa-4489-98e2-0363712f619a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.534 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.535 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.535 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.536 187223 DEBUG nova.compute.resource_tracker [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.755 187223 WARNING nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.757 187223 DEBUG nova.compute.resource_tracker [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5875MB free_disk=73.16293334960938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.758 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.758 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.795 187223 DEBUG nova.compute.resource_tracker [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration for instance 4dd056a9-18fa-4489-98e2-0363712f619a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.817 187223 DEBUG nova.compute.resource_tracker [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.909 187223 DEBUG nova.compute.resource_tracker [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration aba3d19c-da76-4dcf-bab1-d41669693cd5 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.910 187223 DEBUG nova.compute.resource_tracker [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.910 187223 DEBUG nova.compute.resource_tracker [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.961 187223 DEBUG nova.compute.provider_tree [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 14:13:37 np0005535656 nova_compute[187219]: 2025-11-25 19:13:37.991 187223 DEBUG nova.scheduler.client.report [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 25 14:13:38 np0005535656 nova_compute[187219]: 2025-11-25 19:13:38.025 187223 DEBUG nova.compute.resource_tracker [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 25 14:13:38 np0005535656 nova_compute[187219]: 2025-11-25 19:13:38.025 187223 DEBUG oslo_concurrency.lockutils [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:13:38 np0005535656 nova_compute[187219]: 2025-11-25 19:13:38.030 187223 INFO nova.compute.manager [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Nov 25 14:13:38 np0005535656 nova_compute[187219]: 2025-11-25 19:13:38.117 187223 INFO nova.scheduler.client.report [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Deleted allocation for migration aba3d19c-da76-4dcf-bab1-d41669693cd5
Nov 25 14:13:38 np0005535656 nova_compute[187219]: 2025-11-25 19:13:38.118 187223 DEBUG nova.virt.libvirt.driver [None req-ef72ce5e-734d-43c6-b28a-2b0b28199caa fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 25 14:13:39 np0005535656 nova_compute[187219]: 2025-11-25 19:13:39.261 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:13:42 np0005535656 nova_compute[187219]: 2025-11-25 19:13:42.500 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:13:43 np0005535656 nova_compute[187219]: 2025-11-25 19:13:43.443 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764098008.4419074, 4dd056a9-18fa-4489-98e2-0363712f619a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 14:13:43 np0005535656 nova_compute[187219]: 2025-11-25 19:13:43.443 187223 INFO nova.compute.manager [-] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] VM Stopped (Lifecycle Event)
Nov 25 14:13:43 np0005535656 nova_compute[187219]: 2025-11-25 19:13:43.467 187223 DEBUG nova.compute.manager [None req-6d0e1811-a058-44d0-ac10-1faa1edcb3c1 - - - - - -] [instance: 4dd056a9-18fa-4489-98e2-0363712f619a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 14:13:44 np0005535656 nova_compute[187219]: 2025-11-25 19:13:44.263 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:13:46 np0005535656 podman[216317]: 2025-11-25 19:13:46.920912623 +0000 UTC m=+0.043947460 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:13:47 np0005535656 nova_compute[187219]: 2025-11-25 19:13:47.502 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:49 np0005535656 nova_compute[187219]: 2025-11-25 19:13:49.313 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:13:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:13:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:13:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:13:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:13:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:13:52 np0005535656 nova_compute[187219]: 2025-11-25 19:13:52.504 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:54 np0005535656 nova_compute[187219]: 2025-11-25 19:13:54.315 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:56 np0005535656 podman[216342]: 2025-11-25 19:13:56.946384221 +0000 UTC m=+0.060328119 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 25 14:13:56 np0005535656 podman[216341]: 2025-11-25 19:13:56.993421972 +0000 UTC m=+0.109383075 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 25 14:13:57 np0005535656 nova_compute[187219]: 2025-11-25 19:13:57.508 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:13:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:59.090 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:13:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:59.090 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:13:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:13:59.091 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:13:59 np0005535656 nova_compute[187219]: 2025-11-25 19:13:59.318 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:02 np0005535656 nova_compute[187219]: 2025-11-25 19:14:02.509 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:02 np0005535656 podman[216382]: 2025-11-25 19:14:02.952967607 +0000 UTC m=+0.073857605 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Nov 25 14:14:04 np0005535656 nova_compute[187219]: 2025-11-25 19:14:04.320 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:05 np0005535656 podman[197580]: time="2025-11-25T19:14:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:14:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:14:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:14:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:14:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 25 14:14:06 np0005535656 nova_compute[187219]: 2025-11-25 19:14:06.425 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:06 np0005535656 nova_compute[187219]: 2025-11-25 19:14:06.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:06 np0005535656 nova_compute[187219]: 2025-11-25 19:14:06.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:14:06 np0005535656 nova_compute[187219]: 2025-11-25 19:14:06.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:14:06 np0005535656 nova_compute[187219]: 2025-11-25 19:14:06.707 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:14:06 np0005535656 podman[216405]: 2025-11-25 19:14:06.95862416 +0000 UTC m=+0.078322435 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 25 14:14:07 np0005535656 nova_compute[187219]: 2025-11-25 19:14:07.511 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:07 np0005535656 nova_compute[187219]: 2025-11-25 19:14:07.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:09 np0005535656 nova_compute[187219]: 2025-11-25 19:14:09.262 187223 DEBUG nova.compute.manager [None req-305490ac-386d-4147-9ba3-8d8a4603cc40 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606#033[00m
Nov 25 14:14:09 np0005535656 nova_compute[187219]: 2025-11-25 19:14:09.322 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:09 np0005535656 nova_compute[187219]: 2025-11-25 19:14:09.336 187223 DEBUG nova.compute.provider_tree [None req-305490ac-386d-4147-9ba3-8d8a4603cc40 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Updating resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea generation from 35 to 38 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 25 14:14:09 np0005535656 nova_compute[187219]: 2025-11-25 19:14:09.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:10 np0005535656 nova_compute[187219]: 2025-11-25 19:14:10.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:10 np0005535656 nova_compute[187219]: 2025-11-25 19:14:10.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:12 np0005535656 nova_compute[187219]: 2025-11-25 19:14:12.512 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.107 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.108 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.132 187223 DEBUG nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.237 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.238 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.248 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.248 187223 INFO nova.compute.claims [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.374 187223 DEBUG nova.compute.provider_tree [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.388 187223 DEBUG nova.scheduler.client.report [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.414 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.415 187223 DEBUG nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.466 187223 DEBUG nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.467 187223 DEBUG nova.network.neutron [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.491 187223 INFO nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.513 187223 DEBUG nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.627 187223 DEBUG nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.629 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.630 187223 INFO nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Creating image(s)#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.631 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "/var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.631 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.632 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.657 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.682 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.683 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.683 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.746 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.747 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.748 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.762 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.829 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.830 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.864 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.865 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.866 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.918 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.919 187223 DEBUG nova.virt.disk.api [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Checking if we can resize image /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.919 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.974 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.976 187223 DEBUG nova.virt.disk.api [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Cannot resize image /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:14:13 np0005535656 nova_compute[187219]: 2025-11-25 19:14:13.977 187223 DEBUG nova.objects.instance [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e32bc34-e262-44f0-b382-e97dd53aa66c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.001 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.002 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Ensure instance console log exists: /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.003 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.004 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.004 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.288 187223 DEBUG nova.policy [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e60aa8a36ef94fa186a5c8de1df9e594', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab3670f92d82410b981d159346c0c038', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.324 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.714 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.715 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.715 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.716 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.949 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.950 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5871MB free_disk=73.16272354125977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.951 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:14 np0005535656 nova_compute[187219]: 2025-11-25 19:14:14.951 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:15 np0005535656 nova_compute[187219]: 2025-11-25 19:14:15.068 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 4e32bc34-e262-44f0-b382-e97dd53aa66c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:14:15 np0005535656 nova_compute[187219]: 2025-11-25 19:14:15.069 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:14:15 np0005535656 nova_compute[187219]: 2025-11-25 19:14:15.069 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:14:15 np0005535656 nova_compute[187219]: 2025-11-25 19:14:15.145 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:14:15 np0005535656 nova_compute[187219]: 2025-11-25 19:14:15.180 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:14:15 np0005535656 nova_compute[187219]: 2025-11-25 19:14:15.206 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:14:15 np0005535656 nova_compute[187219]: 2025-11-25 19:14:15.207 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:15 np0005535656 nova_compute[187219]: 2025-11-25 19:14:15.454 187223 DEBUG nova.network.neutron [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Successfully created port: 8cc08f3a-912b-45b3-9b53-ad9415a67906 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:14:16 np0005535656 nova_compute[187219]: 2025-11-25 19:14:16.741 187223 DEBUG nova.network.neutron [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Successfully updated port: 8cc08f3a-912b-45b3-9b53-ad9415a67906 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:14:16 np0005535656 nova_compute[187219]: 2025-11-25 19:14:16.761 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "refresh_cache-4e32bc34-e262-44f0-b382-e97dd53aa66c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:14:16 np0005535656 nova_compute[187219]: 2025-11-25 19:14:16.762 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquired lock "refresh_cache-4e32bc34-e262-44f0-b382-e97dd53aa66c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:14:16 np0005535656 nova_compute[187219]: 2025-11-25 19:14:16.762 187223 DEBUG nova.network.neutron [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:14:16 np0005535656 nova_compute[187219]: 2025-11-25 19:14:16.883 187223 DEBUG nova.compute.manager [req-228e66e4-008d-46b0-9425-34965286fba5 req-68ece25f-7f71-4df0-b67e-44c8fa6fb673 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-changed-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:14:16 np0005535656 nova_compute[187219]: 2025-11-25 19:14:16.884 187223 DEBUG nova.compute.manager [req-228e66e4-008d-46b0-9425-34965286fba5 req-68ece25f-7f71-4df0-b67e-44c8fa6fb673 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Refreshing instance network info cache due to event network-changed-8cc08f3a-912b-45b3-9b53-ad9415a67906. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:14:16 np0005535656 nova_compute[187219]: 2025-11-25 19:14:16.884 187223 DEBUG oslo_concurrency.lockutils [req-228e66e4-008d-46b0-9425-34965286fba5 req-68ece25f-7f71-4df0-b67e-44c8fa6fb673 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-4e32bc34-e262-44f0-b382-e97dd53aa66c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:14:16 np0005535656 nova_compute[187219]: 2025-11-25 19:14:16.936 187223 DEBUG nova.network.neutron [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:14:17 np0005535656 nova_compute[187219]: 2025-11-25 19:14:17.627 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:17 np0005535656 podman[216440]: 2025-11-25 19:14:17.922611748 +0000 UTC m=+0.052187863 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.151 187223 DEBUG nova.network.neutron [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Updating instance_info_cache with network_info: [{"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.209 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Releasing lock "refresh_cache-4e32bc34-e262-44f0-b382-e97dd53aa66c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.209 187223 DEBUG nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Instance network_info: |[{"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.211 187223 DEBUG oslo_concurrency.lockutils [req-228e66e4-008d-46b0-9425-34965286fba5 req-68ece25f-7f71-4df0-b67e-44c8fa6fb673 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-4e32bc34-e262-44f0-b382-e97dd53aa66c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.212 187223 DEBUG nova.network.neutron [req-228e66e4-008d-46b0-9425-34965286fba5 req-68ece25f-7f71-4df0-b67e-44c8fa6fb673 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Refreshing network info cache for port 8cc08f3a-912b-45b3-9b53-ad9415a67906 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.217 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Start _get_guest_xml network_info=[{"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.224 187223 WARNING nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.231 187223 DEBUG nova.virt.libvirt.host [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.232 187223 DEBUG nova.virt.libvirt.host [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.236 187223 DEBUG nova.virt.libvirt.host [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.237 187223 DEBUG nova.virt.libvirt.host [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.240 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.240 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.241 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.242 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.243 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.243 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.244 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.244 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.245 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.245 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.246 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.246 187223 DEBUG nova.virt.hardware [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.253 187223 DEBUG nova.virt.libvirt.vif [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-820432432',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-820432432',id=21,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-0vk8cqaj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:14:13Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=4e32bc34-e262-44f0-b382-e97dd53aa66c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.254 187223 DEBUG nova.network.os_vif_util [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.255 187223 DEBUG nova.network.os_vif_util [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:c0:fb,bridge_name='br-int',has_traffic_filtering=True,id=8cc08f3a-912b-45b3-9b53-ad9415a67906,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cc08f3a-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.257 187223 DEBUG nova.objects.instance [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e32bc34-e262-44f0-b382-e97dd53aa66c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.273 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <uuid>4e32bc34-e262-44f0-b382-e97dd53aa66c</uuid>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <name>instance-00000015</name>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteStrategies-server-820432432</nova:name>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:14:18</nova:creationTime>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:        <nova:user uuid="e60aa8a36ef94fa186a5c8de1df9e594">tempest-TestExecuteStrategies-2025590332-project-member</nova:user>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:        <nova:project uuid="ab3670f92d82410b981d159346c0c038">tempest-TestExecuteStrategies-2025590332</nova:project>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:        <nova:port uuid="8cc08f3a-912b-45b3-9b53-ad9415a67906">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <entry name="serial">4e32bc34-e262-44f0-b382-e97dd53aa66c</entry>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <entry name="uuid">4e32bc34-e262-44f0-b382-e97dd53aa66c</entry>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk.config"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:80:c0:fb"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <target dev="tap8cc08f3a-91"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/console.log" append="off"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:14:18 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:14:18 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:14:18 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:14:18 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.275 187223 DEBUG nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Preparing to wait for external event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.276 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.276 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.277 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.277 187223 DEBUG nova.virt.libvirt.vif [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-820432432',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-820432432',id=21,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-0vk8cqaj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:14:13Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=4e32bc34-e262-44f0-b382-e97dd53aa66c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.278 187223 DEBUG nova.network.os_vif_util [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.278 187223 DEBUG nova.network.os_vif_util [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:c0:fb,bridge_name='br-int',has_traffic_filtering=True,id=8cc08f3a-912b-45b3-9b53-ad9415a67906,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cc08f3a-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.279 187223 DEBUG os_vif [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:c0:fb,bridge_name='br-int',has_traffic_filtering=True,id=8cc08f3a-912b-45b3-9b53-ad9415a67906,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cc08f3a-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.280 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.280 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.281 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.284 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.285 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cc08f3a-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.285 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8cc08f3a-91, col_values=(('external_ids', {'iface-id': '8cc08f3a-912b-45b3-9b53-ad9415a67906', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:c0:fb', 'vm-uuid': '4e32bc34-e262-44f0-b382-e97dd53aa66c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.287 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:18 np0005535656 NetworkManager[55548]: <info>  [1764098058.2882] manager: (tap8cc08f3a-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.292 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.295 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.296 187223 INFO os_vif [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:c0:fb,bridge_name='br-int',has_traffic_filtering=True,id=8cc08f3a-912b-45b3-9b53-ad9415a67906,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cc08f3a-91')#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.351 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.351 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.352 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No VIF found with MAC fa:16:3e:80:c0:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:14:18 np0005535656 nova_compute[187219]: 2025-11-25 19:14:18.353 187223 INFO nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Using config drive#033[00m
Nov 25 14:14:19 np0005535656 nova_compute[187219]: 2025-11-25 19:14:19.417 187223 INFO nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Creating config drive at /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk.config#033[00m
Nov 25 14:14:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:14:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:14:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:14:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:14:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:14:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:14:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:14:19 np0005535656 nova_compute[187219]: 2025-11-25 19:14:19.429 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp43tmv0gz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:14:19 np0005535656 nova_compute[187219]: 2025-11-25 19:14:19.578 187223 DEBUG oslo_concurrency.processutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp43tmv0gz" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:14:19 np0005535656 kernel: tap8cc08f3a-91: entered promiscuous mode
Nov 25 14:14:19 np0005535656 NetworkManager[55548]: <info>  [1764098059.6487] manager: (tap8cc08f3a-91): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Nov 25 14:14:19 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:19Z|00155|binding|INFO|Claiming lport 8cc08f3a-912b-45b3-9b53-ad9415a67906 for this chassis.
Nov 25 14:14:19 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:19Z|00156|binding|INFO|8cc08f3a-912b-45b3-9b53-ad9415a67906: Claiming fa:16:3e:80:c0:fb 10.100.0.12
Nov 25 14:14:19 np0005535656 nova_compute[187219]: 2025-11-25 19:14:19.650 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:19 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:19Z|00157|binding|INFO|Setting lport 8cc08f3a-912b-45b3-9b53-ad9415a67906 ovn-installed in OVS
Nov 25 14:14:19 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:19Z|00158|binding|INFO|Setting lport 8cc08f3a-912b-45b3-9b53-ad9415a67906 up in Southbound
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.662 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:c0:fb 10.100.0.12'], port_security=['fa:16:3e:80:c0:fb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4e32bc34-e262-44f0-b382-e97dd53aa66c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=8cc08f3a-912b-45b3-9b53-ad9415a67906) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:14:19 np0005535656 nova_compute[187219]: 2025-11-25 19:14:19.662 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.665 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 8cc08f3a-912b-45b3-9b53-ad9415a67906 in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 bound to our chassis#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.668 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891#033[00m
Nov 25 14:14:19 np0005535656 nova_compute[187219]: 2025-11-25 19:14:19.668 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:19 np0005535656 nova_compute[187219]: 2025-11-25 19:14:19.675 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.679 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4eb86a-7f99-4e9e-bac5-6a280fa65e23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.680 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e881e87-b1 in ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.685 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e881e87-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.685 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[8d2d9330-0523-4834-afa6-947037e3906c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.686 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ec335a-0a0c-4aea-812d-486d23f92fbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 systemd-udevd[216484]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.700 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c10d37-beba-45dd-a50b-eb7700ecc8a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 systemd-machined[153481]: New machine qemu-14-instance-00000015.
Nov 25 14:14:19 np0005535656 NetworkManager[55548]: <info>  [1764098059.7098] device (tap8cc08f3a-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:14:19 np0005535656 NetworkManager[55548]: <info>  [1764098059.7108] device (tap8cc08f3a-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.725 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[d7e440ce-d1c4-43f0-a9b8-d7486a193a5a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 systemd[1]: Started Virtual Machine qemu-14-instance-00000015.
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.780 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[e44240a6-9a1e-48d4-9d36-a7336fdeb9e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.790 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f38cdf3c-1701-45b5-8e3b-6bf3955e9d92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 NetworkManager[55548]: <info>  [1764098059.7928] manager: (tap8e881e87-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/68)
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.837 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[ab9b50c5-e391-4cc4-bc00-55c2c90d2823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.843 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ca16d7-608e-412a-afea-832017aeccc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 NetworkManager[55548]: <info>  [1764098059.8838] device (tap8e881e87-b0): carrier: link connected
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.892 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[2be9818a-4a6b-45ca-94e4-a4976ae310ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.914 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb42be9-d074-4dcf-9eab-bd906b237cd1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502140, 'reachable_time': 18056, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216517, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.933 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[fe97f3e7-31f3-44c8-a28f-bc7c0c6593af]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:6d5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 502140, 'tstamp': 502140}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216518, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.954 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e91749-ebd4-497e-8934-c4707bbfb272]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502140, 'reachable_time': 18056, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216519, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:19 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:19.989 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[86fa3d38-c7b1-45e2-b575-beb539dcf35d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:20.055 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4c203d-cd0c-4499-b687-3a4e54ee10e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:20.057 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:20.057 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:20.059 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e881e87-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.062 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:20 np0005535656 NetworkManager[55548]: <info>  [1764098060.0622] manager: (tap8e881e87-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Nov 25 14:14:20 np0005535656 kernel: tap8e881e87-b0: entered promiscuous mode
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:20.066 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e881e87-b0, col_values=(('external_ids', {'iface-id': 'f01fca37-0f9e-4574-bd34-7de06647d521'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.067 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:20 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:20Z|00159|binding|INFO|Releasing lport f01fca37-0f9e-4574-bd34-7de06647d521 from this chassis (sb_readonly=0)
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:20.069 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:20.070 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[3d9e1fea-0f88-45a2-90f2-4b1da3aeb3aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:20.072 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 14:14:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:20.073 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'env', 'PROCESS_TAG=haproxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e881e87-b103-4ad8-8de5-f8f4f0a10891.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.079 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.169 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098060.1685817, 4e32bc34-e262-44f0-b382-e97dd53aa66c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.169 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] VM Started (Lifecycle Event)#033[00m
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.196 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.201 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098060.1687093, 4e32bc34-e262-44f0-b382-e97dd53aa66c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.201 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] VM Paused (Lifecycle Event)
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.229 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.232 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.265 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.423 187223 DEBUG nova.compute.manager [req-e0d56da5-1492-40be-8201-f812a37290ad req-b36a69bf-8c82-4e3e-a345-447e88404438 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.424 187223 DEBUG oslo_concurrency.lockutils [req-e0d56da5-1492-40be-8201-f812a37290ad req-b36a69bf-8c82-4e3e-a345-447e88404438 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.424 187223 DEBUG oslo_concurrency.lockutils [req-e0d56da5-1492-40be-8201-f812a37290ad req-b36a69bf-8c82-4e3e-a345-447e88404438 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.425 187223 DEBUG oslo_concurrency.lockutils [req-e0d56da5-1492-40be-8201-f812a37290ad req-b36a69bf-8c82-4e3e-a345-447e88404438 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.425 187223 DEBUG nova.compute.manager [req-e0d56da5-1492-40be-8201-f812a37290ad req-b36a69bf-8c82-4e3e-a345-447e88404438 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Processing event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.426 187223 DEBUG nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.429 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098060.4290514, 4e32bc34-e262-44f0-b382-e97dd53aa66c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.429 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] VM Resumed (Lifecycle Event)
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.431 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.434 187223 INFO nova.virt.libvirt.driver [-] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Instance spawned successfully.
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.435 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 25 14:14:20 np0005535656 podman[216558]: 2025-11-25 19:14:20.447554842 +0000 UTC m=+0.054697410 container create 9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.455 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.461 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.465 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.466 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.466 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.467 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.467 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.468 187223 DEBUG nova.virt.libvirt.driver [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 25 14:14:20 np0005535656 systemd[1]: Started libpod-conmon-9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a.scope.
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.494 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 25 14:14:20 np0005535656 podman[216558]: 2025-11-25 19:14:20.418899921 +0000 UTC m=+0.026042509 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:14:20 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:14:20 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d745ab2bfd12587934e1a309cc0773a91c46f0110fc28ce788ddb97039b41d9f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.525 187223 INFO nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Took 6.90 seconds to spawn the instance on the hypervisor.
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.526 187223 DEBUG nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 25 14:14:20 np0005535656 podman[216558]: 2025-11-25 19:14:20.535287858 +0000 UTC m=+0.142430436 container init 9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 25 14:14:20 np0005535656 podman[216558]: 2025-11-25 19:14:20.54278308 +0000 UTC m=+0.149925648 container start 9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 14:14:20 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[216573]: [NOTICE]   (216577) : New worker (216579) forked
Nov 25 14:14:20 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[216573]: [NOTICE]   (216577) : Loading success.
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.590 187223 INFO nova.compute.manager [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Took 7.37 seconds to build instance.
Nov 25 14:14:20 np0005535656 nova_compute[187219]: 2025-11-25 19:14:20.609 187223 DEBUG oslo_concurrency.lockutils [None req-9d3367fe-5dc5-4797-bc79-8f1a20b2e313 e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.501s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:14:21 np0005535656 nova_compute[187219]: 2025-11-25 19:14:21.483 187223 DEBUG nova.network.neutron [req-228e66e4-008d-46b0-9425-34965286fba5 req-68ece25f-7f71-4df0-b67e-44c8fa6fb673 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Updated VIF entry in instance network info cache for port 8cc08f3a-912b-45b3-9b53-ad9415a67906. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 25 14:14:21 np0005535656 nova_compute[187219]: 2025-11-25 19:14:21.484 187223 DEBUG nova.network.neutron [req-228e66e4-008d-46b0-9425-34965286fba5 req-68ece25f-7f71-4df0-b67e-44c8fa6fb673 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Updating instance_info_cache with network_info: [{"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 25 14:14:21 np0005535656 nova_compute[187219]: 2025-11-25 19:14:21.501 187223 DEBUG oslo_concurrency.lockutils [req-228e66e4-008d-46b0-9425-34965286fba5 req-68ece25f-7f71-4df0-b67e-44c8fa6fb673 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-4e32bc34-e262-44f0-b382-e97dd53aa66c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 25 14:14:22 np0005535656 nova_compute[187219]: 2025-11-25 19:14:22.547 187223 DEBUG nova.compute.manager [req-d4fa5347-4e01-40d5-b51f-0f3b405bef0a req-65bd3dee-75e1-4838-8fe0-423d25b73cab 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 25 14:14:22 np0005535656 nova_compute[187219]: 2025-11-25 19:14:22.547 187223 DEBUG oslo_concurrency.lockutils [req-d4fa5347-4e01-40d5-b51f-0f3b405bef0a req-65bd3dee-75e1-4838-8fe0-423d25b73cab 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:14:22 np0005535656 nova_compute[187219]: 2025-11-25 19:14:22.548 187223 DEBUG oslo_concurrency.lockutils [req-d4fa5347-4e01-40d5-b51f-0f3b405bef0a req-65bd3dee-75e1-4838-8fe0-423d25b73cab 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:14:22 np0005535656 nova_compute[187219]: 2025-11-25 19:14:22.548 187223 DEBUG oslo_concurrency.lockutils [req-d4fa5347-4e01-40d5-b51f-0f3b405bef0a req-65bd3dee-75e1-4838-8fe0-423d25b73cab 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:14:22 np0005535656 nova_compute[187219]: 2025-11-25 19:14:22.549 187223 DEBUG nova.compute.manager [req-d4fa5347-4e01-40d5-b51f-0f3b405bef0a req-65bd3dee-75e1-4838-8fe0-423d25b73cab 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] No waiting events found dispatching network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 25 14:14:22 np0005535656 nova_compute[187219]: 2025-11-25 19:14:22.549 187223 WARNING nova.compute.manager [req-d4fa5347-4e01-40d5-b51f-0f3b405bef0a req-65bd3dee-75e1-4838-8fe0-423d25b73cab 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received unexpected event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 for instance with vm_state active and task_state None.
Nov 25 14:14:22 np0005535656 nova_compute[187219]: 2025-11-25 19:14:22.628 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:23 np0005535656 nova_compute[187219]: 2025-11-25 19:14:23.288 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:27 np0005535656 nova_compute[187219]: 2025-11-25 19:14:27.650 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:27 np0005535656 podman[216589]: 2025-11-25 19:14:27.976223839 +0000 UTC m=+0.084260704 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Nov 25 14:14:28 np0005535656 podman[216588]: 2025-11-25 19:14:28.020500068 +0000 UTC m=+0.133412035 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 25 14:14:28 np0005535656 nova_compute[187219]: 2025-11-25 19:14:28.290 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:32 np0005535656 nova_compute[187219]: 2025-11-25 19:14:32.249 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:32 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:32.248 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 25 14:14:32 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:32.250 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 25 14:14:32 np0005535656 nova_compute[187219]: 2025-11-25 19:14:32.652 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:33 np0005535656 nova_compute[187219]: 2025-11-25 19:14:33.293 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:33 np0005535656 podman[216649]: 2025-11-25 19:14:33.986797444 +0000 UTC m=+0.105692421 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git)
Nov 25 14:14:34 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:34Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:80:c0:fb 10.100.0.12
Nov 25 14:14:34 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:34Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:80:c0:fb 10.100.0.12
Nov 25 14:14:35 np0005535656 podman[197580]: time="2025-11-25T19:14:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:14:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:14:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:14:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:14:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3064 "" "Go-http-client/1.1"
Nov 25 14:14:37 np0005535656 nova_compute[187219]: 2025-11-25 19:14:37.693 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:37 np0005535656 podman[216670]: 2025-11-25 19:14:37.964554537 +0000 UTC m=+0.082158259 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 14:14:38 np0005535656 nova_compute[187219]: 2025-11-25 19:14:38.295 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:41 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:41.252 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 14:14:42 np0005535656 nova_compute[187219]: 2025-11-25 19:14:42.744 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:43 np0005535656 nova_compute[187219]: 2025-11-25 19:14:43.299 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:14:45 np0005535656 nova_compute[187219]: 2025-11-25 19:14:45.949 187223 DEBUG nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Check if temp file /var/lib/nova/instances/tmpypvtksya exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 25 14:14:45 np0005535656 nova_compute[187219]: 2025-11-25 19:14:45.949 187223 DEBUG nova.compute.manager [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpypvtksya',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4e32bc34-e262-44f0-b382-e97dd53aa66c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 25 14:14:46 np0005535656 nova_compute[187219]: 2025-11-25 19:14:46.840 187223 DEBUG oslo_concurrency.processutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:14:46 np0005535656 nova_compute[187219]: 2025-11-25 19:14:46.893 187223 DEBUG oslo_concurrency.processutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:14:46 np0005535656 nova_compute[187219]: 2025-11-25 19:14:46.894 187223 DEBUG oslo_concurrency.processutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:14:46 np0005535656 nova_compute[187219]: 2025-11-25 19:14:46.948 187223 DEBUG oslo_concurrency.processutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:14:47 np0005535656 nova_compute[187219]: 2025-11-25 19:14:47.746 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:48 np0005535656 nova_compute[187219]: 2025-11-25 19:14:48.302 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:48 np0005535656 podman[216696]: 2025-11-25 19:14:48.957215754 +0000 UTC m=+0.076210359 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:14:49 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 14:14:49 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 14:14:49 np0005535656 systemd-logind[788]: New session 40 of user nova.
Nov 25 14:14:49 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 14:14:49 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 14:14:49 np0005535656 systemd[216724]: Queued start job for default target Main User Target.
Nov 25 14:14:49 np0005535656 systemd[216724]: Created slice User Application Slice.
Nov 25 14:14:49 np0005535656 systemd[216724]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:14:49 np0005535656 systemd[216724]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 14:14:49 np0005535656 systemd[216724]: Reached target Paths.
Nov 25 14:14:49 np0005535656 systemd[216724]: Reached target Timers.
Nov 25 14:14:49 np0005535656 systemd[216724]: Starting D-Bus User Message Bus Socket...
Nov 25 14:14:49 np0005535656 systemd[216724]: Starting Create User's Volatile Files and Directories...
Nov 25 14:14:49 np0005535656 systemd[216724]: Listening on D-Bus User Message Bus Socket.
Nov 25 14:14:49 np0005535656 systemd[216724]: Reached target Sockets.
Nov 25 14:14:49 np0005535656 systemd[216724]: Finished Create User's Volatile Files and Directories.
Nov 25 14:14:49 np0005535656 systemd[216724]: Reached target Basic System.
Nov 25 14:14:49 np0005535656 systemd[216724]: Reached target Main User Target.
Nov 25 14:14:49 np0005535656 systemd[216724]: Startup finished in 128ms.
Nov 25 14:14:49 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 14:14:49 np0005535656 systemd[1]: Started Session 40 of User nova.
Nov 25 14:14:49 np0005535656 systemd[1]: session-40.scope: Deactivated successfully.
Nov 25 14:14:49 np0005535656 systemd-logind[788]: Session 40 logged out. Waiting for processes to exit.
Nov 25 14:14:49 np0005535656 systemd-logind[788]: Removed session 40.
Nov 25 14:14:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:14:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:14:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:14:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:14:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:14:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:14:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:14:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:14:51 np0005535656 nova_compute[187219]: 2025-11-25 19:14:51.533 187223 DEBUG nova.compute.manager [req-6eb85359-0cbd-4de7-b906-6d6f5d75080f req-b6e2271b-9c80-4888-8570-343c36e08183 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-unplugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:14:51 np0005535656 nova_compute[187219]: 2025-11-25 19:14:51.534 187223 DEBUG oslo_concurrency.lockutils [req-6eb85359-0cbd-4de7-b906-6d6f5d75080f req-b6e2271b-9c80-4888-8570-343c36e08183 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:51 np0005535656 nova_compute[187219]: 2025-11-25 19:14:51.534 187223 DEBUG oslo_concurrency.lockutils [req-6eb85359-0cbd-4de7-b906-6d6f5d75080f req-b6e2271b-9c80-4888-8570-343c36e08183 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:51 np0005535656 nova_compute[187219]: 2025-11-25 19:14:51.534 187223 DEBUG oslo_concurrency.lockutils [req-6eb85359-0cbd-4de7-b906-6d6f5d75080f req-b6e2271b-9c80-4888-8570-343c36e08183 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:51 np0005535656 nova_compute[187219]: 2025-11-25 19:14:51.535 187223 DEBUG nova.compute.manager [req-6eb85359-0cbd-4de7-b906-6d6f5d75080f req-b6e2271b-9c80-4888-8570-343c36e08183 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] No waiting events found dispatching network-vif-unplugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:14:51 np0005535656 nova_compute[187219]: 2025-11-25 19:14:51.535 187223 DEBUG nova.compute.manager [req-6eb85359-0cbd-4de7-b906-6d6f5d75080f req-b6e2271b-9c80-4888-8570-343c36e08183 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-unplugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:14:52 np0005535656 nova_compute[187219]: 2025-11-25 19:14:52.774 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:53 np0005535656 nova_compute[187219]: 2025-11-25 19:14:53.304 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:53 np0005535656 nova_compute[187219]: 2025-11-25 19:14:53.667 187223 DEBUG nova.compute.manager [req-d60a3f59-468f-4685-b1cc-f61d98da618d req-0f01e469-01f2-4fdd-9170-11fc40e51699 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:14:53 np0005535656 nova_compute[187219]: 2025-11-25 19:14:53.668 187223 DEBUG oslo_concurrency.lockutils [req-d60a3f59-468f-4685-b1cc-f61d98da618d req-0f01e469-01f2-4fdd-9170-11fc40e51699 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:53 np0005535656 nova_compute[187219]: 2025-11-25 19:14:53.668 187223 DEBUG oslo_concurrency.lockutils [req-d60a3f59-468f-4685-b1cc-f61d98da618d req-0f01e469-01f2-4fdd-9170-11fc40e51699 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:53 np0005535656 nova_compute[187219]: 2025-11-25 19:14:53.669 187223 DEBUG oslo_concurrency.lockutils [req-d60a3f59-468f-4685-b1cc-f61d98da618d req-0f01e469-01f2-4fdd-9170-11fc40e51699 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:53 np0005535656 nova_compute[187219]: 2025-11-25 19:14:53.669 187223 DEBUG nova.compute.manager [req-d60a3f59-468f-4685-b1cc-f61d98da618d req-0f01e469-01f2-4fdd-9170-11fc40e51699 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] No waiting events found dispatching network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:14:53 np0005535656 nova_compute[187219]: 2025-11-25 19:14:53.670 187223 WARNING nova.compute.manager [req-d60a3f59-468f-4685-b1cc-f61d98da618d req-0f01e469-01f2-4fdd-9170-11fc40e51699 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received unexpected event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.288 187223 INFO nova.compute.manager [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Took 7.34 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.288 187223 DEBUG nova.compute.manager [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.315 187223 DEBUG nova.compute.manager [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpypvtksya',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4e32bc34-e262-44f0-b382-e97dd53aa66c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(94b4e947-3fb9-41ee-8ab7-e4d4c80f1b55),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.337 187223 DEBUG nova.objects.instance [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lazy-loading 'migration_context' on Instance uuid 4e32bc34-e262-44f0-b382-e97dd53aa66c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.338 187223 DEBUG nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.340 187223 DEBUG nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.340 187223 DEBUG nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.355 187223 DEBUG nova.virt.libvirt.vif [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-820432432',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-820432432',id=21,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:14:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-0vk8cqaj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:14:20Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=4e32bc34-e262-44f0-b382-e97dd53aa66c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.355 187223 DEBUG nova.network.os_vif_util [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Converting VIF {"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.356 187223 DEBUG nova.network.os_vif_util [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:c0:fb,bridge_name='br-int',has_traffic_filtering=True,id=8cc08f3a-912b-45b3-9b53-ad9415a67906,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cc08f3a-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.356 187223 DEBUG nova.virt.libvirt.migration [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 14:14:54 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:80:c0:fb"/>
Nov 25 14:14:54 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 14:14:54 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:14:54 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 14:14:54 np0005535656 nova_compute[187219]:  <target dev="tap8cc08f3a-91"/>
Nov 25 14:14:54 np0005535656 nova_compute[187219]: </interface>
Nov 25 14:14:54 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.357 187223 DEBUG nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.843 187223 DEBUG nova.virt.libvirt.migration [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.844 187223 INFO nova.virt.libvirt.migration [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 14:14:54 np0005535656 nova_compute[187219]: 2025-11-25 19:14:54.927 187223 INFO nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.431 187223 DEBUG nova.virt.libvirt.migration [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.432 187223 DEBUG nova.virt.libvirt.migration [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.764 187223 DEBUG nova.compute.manager [req-17785e98-842c-438b-901c-0a5a8ec095d3 req-9eda70e5-bce5-4e7c-bf89-4207f3c22029 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-changed-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.764 187223 DEBUG nova.compute.manager [req-17785e98-842c-438b-901c-0a5a8ec095d3 req-9eda70e5-bce5-4e7c-bf89-4207f3c22029 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Refreshing instance network info cache due to event network-changed-8cc08f3a-912b-45b3-9b53-ad9415a67906. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.765 187223 DEBUG oslo_concurrency.lockutils [req-17785e98-842c-438b-901c-0a5a8ec095d3 req-9eda70e5-bce5-4e7c-bf89-4207f3c22029 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-4e32bc34-e262-44f0-b382-e97dd53aa66c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.765 187223 DEBUG oslo_concurrency.lockutils [req-17785e98-842c-438b-901c-0a5a8ec095d3 req-9eda70e5-bce5-4e7c-bf89-4207f3c22029 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-4e32bc34-e262-44f0-b382-e97dd53aa66c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.766 187223 DEBUG nova.network.neutron [req-17785e98-842c-438b-901c-0a5a8ec095d3 req-9eda70e5-bce5-4e7c-bf89-4207f3c22029 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Refreshing network info cache for port 8cc08f3a-912b-45b3-9b53-ad9415a67906 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.812 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.936 187223 DEBUG nova.virt.libvirt.migration [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:14:55 np0005535656 nova_compute[187219]: 2025-11-25 19:14:55.937 187223 DEBUG nova.virt.libvirt.migration [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.441 187223 DEBUG nova.virt.libvirt.migration [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.442 187223 DEBUG nova.virt.libvirt.migration [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.533 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098096.5330818, 4e32bc34-e262-44f0-b382-e97dd53aa66c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.533 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.555 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.559 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.587 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 14:14:56 np0005535656 kernel: tap8cc08f3a-91 (unregistering): left promiscuous mode
Nov 25 14:14:56 np0005535656 NetworkManager[55548]: <info>  [1764098096.6817] device (tap8cc08f3a-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:14:56 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:56Z|00160|binding|INFO|Releasing lport 8cc08f3a-912b-45b3-9b53-ad9415a67906 from this chassis (sb_readonly=0)
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.690 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:56 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:56Z|00161|binding|INFO|Setting lport 8cc08f3a-912b-45b3-9b53-ad9415a67906 down in Southbound
Nov 25 14:14:56 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:56Z|00162|binding|INFO|Removing iface tap8cc08f3a-91 ovn-installed in OVS
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.693 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:56 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:56.699 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:c0:fb 10.100.0.12'], port_security=['fa:16:3e:80:c0:fb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4e32bc34-e262-44f0-b382-e97dd53aa66c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=8cc08f3a-912b-45b3-9b53-ad9415a67906) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:14:56 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:56.700 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 8cc08f3a-912b-45b3-9b53-ad9415a67906 in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:14:56 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:56.701 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:14:56 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:56.702 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4d7ae6-61da-45d0-8462-54ddee3888bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:56 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:56.702 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace which is not needed anymore#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.708 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:56 np0005535656 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000015.scope: Deactivated successfully.
Nov 25 14:14:56 np0005535656 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000015.scope: Consumed 14.625s CPU time.
Nov 25 14:14:56 np0005535656 systemd-machined[153481]: Machine qemu-14-instance-00000015 terminated.
Nov 25 14:14:56 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[216573]: [NOTICE]   (216577) : haproxy version is 2.8.14-c23fe91
Nov 25 14:14:56 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[216573]: [NOTICE]   (216577) : path to executable is /usr/sbin/haproxy
Nov 25 14:14:56 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[216573]: [WARNING]  (216577) : Exiting Master process...
Nov 25 14:14:56 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[216573]: [WARNING]  (216577) : Exiting Master process...
Nov 25 14:14:56 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[216573]: [ALERT]    (216577) : Current worker (216579) exited with code 143 (Terminated)
Nov 25 14:14:56 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[216573]: [WARNING]  (216577) : All workers exited. Exiting... (0)
Nov 25 14:14:56 np0005535656 systemd[1]: libpod-9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a.scope: Deactivated successfully.
Nov 25 14:14:56 np0005535656 podman[216777]: 2025-11-25 19:14:56.840129581 +0000 UTC m=+0.052035869 container died 9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:14:56 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a-userdata-shm.mount: Deactivated successfully.
Nov 25 14:14:56 np0005535656 systemd[1]: var-lib-containers-storage-overlay-d745ab2bfd12587934e1a309cc0773a91c46f0110fc28ce788ddb97039b41d9f-merged.mount: Deactivated successfully.
Nov 25 14:14:56 np0005535656 podman[216777]: 2025-11-25 19:14:56.878711857 +0000 UTC m=+0.090618135 container cleanup 9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:14:56 np0005535656 systemd[1]: libpod-conmon-9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a.scope: Deactivated successfully.
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.920 187223 DEBUG nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.921 187223 DEBUG nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.921 187223 DEBUG nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 14:14:56 np0005535656 podman[216810]: 2025-11-25 19:14:56.942150252 +0000 UTC m=+0.041241500 container remove 9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.945 187223 DEBUG nova.virt.libvirt.guest [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '4e32bc34-e262-44f0-b382-e97dd53aa66c' (instance-00000015) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.945 187223 INFO nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Migration operation has completed#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.946 187223 INFO nova.compute.manager [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] _post_live_migration() is started..#033[00m
Nov 25 14:14:56 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:56.950 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5557595d-08d0-4cd2-9ec6-ca4e5e7dbd81]: (4, ('Tue Nov 25 07:14:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a)\n9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a\nTue Nov 25 07:14:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a)\n9aef1f65cdac0b51bc9c6b79ff14aded4ad535ab703e6e3bc1f7fff79d18ca8a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:56 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:56.952 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[be7adac2-0484-43d7-9d7a-810284056a7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:56 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:56.953 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.956 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:56 np0005535656 kernel: tap8e881e87-b0: left promiscuous mode
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.983 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:56 np0005535656 nova_compute[187219]: 2025-11-25 19:14:56.984 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:56 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:56.987 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2a44bd41-c7e8-41fb-aedd-37caf9294ab0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:57 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:57.012 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[7794208f-4bf3-4697-8140-258b50e61934]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:57 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:57.013 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[df545242-0265-40c4-b0ae-214c4d800021]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:57 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:57.026 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c41448ff-8d67-4222-ac9f-8638cdc4bd25]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502129, 'reachable_time': 26301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216840, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:57 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:57.029 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:14:57 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:57.029 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[2f67149f-f44a-4d46-b02e-f4ff8344211e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:14:57 np0005535656 systemd[1]: run-netns-ovnmeta\x2d8e881e87\x2db103\x2d4ad8\x2d8de5\x2df8f4f0a10891.mount: Deactivated successfully.
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.281 187223 DEBUG nova.network.neutron [req-17785e98-842c-438b-901c-0a5a8ec095d3 req-9eda70e5-bce5-4e7c-bf89-4207f3c22029 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Updated VIF entry in instance network info cache for port 8cc08f3a-912b-45b3-9b53-ad9415a67906. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.281 187223 DEBUG nova.network.neutron [req-17785e98-842c-438b-901c-0a5a8ec095d3 req-9eda70e5-bce5-4e7c-bf89-4207f3c22029 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Updating instance_info_cache with network_info: [{"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.328 187223 DEBUG oslo_concurrency.lockutils [req-17785e98-842c-438b-901c-0a5a8ec095d3 req-9eda70e5-bce5-4e7c-bf89-4207f3c22029 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-4e32bc34-e262-44f0-b382-e97dd53aa66c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.639 187223 DEBUG nova.compute.manager [req-bfc4f7fb-9356-45c1-89dd-6cc8eca1ca51 req-01d33c5d-8d43-4b94-a921-24304ed97840 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-unplugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.640 187223 DEBUG oslo_concurrency.lockutils [req-bfc4f7fb-9356-45c1-89dd-6cc8eca1ca51 req-01d33c5d-8d43-4b94-a921-24304ed97840 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.641 187223 DEBUG oslo_concurrency.lockutils [req-bfc4f7fb-9356-45c1-89dd-6cc8eca1ca51 req-01d33c5d-8d43-4b94-a921-24304ed97840 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.641 187223 DEBUG oslo_concurrency.lockutils [req-bfc4f7fb-9356-45c1-89dd-6cc8eca1ca51 req-01d33c5d-8d43-4b94-a921-24304ed97840 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.642 187223 DEBUG nova.compute.manager [req-bfc4f7fb-9356-45c1-89dd-6cc8eca1ca51 req-01d33c5d-8d43-4b94-a921-24304ed97840 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] No waiting events found dispatching network-vif-unplugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.642 187223 DEBUG nova.compute.manager [req-bfc4f7fb-9356-45c1-89dd-6cc8eca1ca51 req-01d33c5d-8d43-4b94-a921-24304ed97840 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-unplugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.678 187223 DEBUG nova.network.neutron [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Activated binding for port 8cc08f3a-912b-45b3-9b53-ad9415a67906 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.679 187223 DEBUG nova.compute.manager [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.681 187223 DEBUG nova.virt.libvirt.vif [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:14:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-820432432',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-820432432',id=21,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:14:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-0vk8cqaj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:14:43Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=4e32bc34-e262-44f0-b382-e97dd53aa66c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.681 187223 DEBUG nova.network.os_vif_util [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Converting VIF {"id": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "address": "fa:16:3e:80:c0:fb", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8cc08f3a-91", "ovs_interfaceid": "8cc08f3a-912b-45b3-9b53-ad9415a67906", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.683 187223 DEBUG nova.network.os_vif_util [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:c0:fb,bridge_name='br-int',has_traffic_filtering=True,id=8cc08f3a-912b-45b3-9b53-ad9415a67906,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cc08f3a-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.684 187223 DEBUG os_vif [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:c0:fb,bridge_name='br-int',has_traffic_filtering=True,id=8cc08f3a-912b-45b3-9b53-ad9415a67906,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cc08f3a-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.686 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.686 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cc08f3a-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.689 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.690 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.694 187223 INFO os_vif [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:c0:fb,bridge_name='br-int',has_traffic_filtering=True,id=8cc08f3a-912b-45b3-9b53-ad9415a67906,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8cc08f3a-91')#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.694 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.695 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.695 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.695 187223 DEBUG nova.compute.manager [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.695 187223 INFO nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Deleting instance files /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c_del#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.696 187223 INFO nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Deletion of /var/lib/nova/instances/4e32bc34-e262-44f0-b382-e97dd53aa66c_del complete#033[00m
Nov 25 14:14:57 np0005535656 nova_compute[187219]: 2025-11-25 19:14:57.776 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.362 187223 DEBUG nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-unplugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.363 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.363 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.364 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.364 187223 DEBUG nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] No waiting events found dispatching network-vif-unplugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.364 187223 DEBUG nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-unplugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.364 187223 DEBUG nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.364 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.365 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.365 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.365 187223 DEBUG nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] No waiting events found dispatching network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.365 187223 WARNING nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received unexpected event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.366 187223 DEBUG nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.366 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.366 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.366 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.367 187223 DEBUG nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] No waiting events found dispatching network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.367 187223 WARNING nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received unexpected event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.367 187223 DEBUG nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.367 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.368 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.368 187223 DEBUG oslo_concurrency.lockutils [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.368 187223 DEBUG nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] No waiting events found dispatching network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:14:58 np0005535656 nova_compute[187219]: 2025-11-25 19:14:58.368 187223 WARNING nova.compute.manager [req-0de85e3c-05c2-4276-840d-9a3a0a8b403a req-200fe9fe-6623-4b72-aeda-dbeb6e44db6f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received unexpected event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:14:58 np0005535656 podman[216842]: 2025-11-25 19:14:58.945991685 +0000 UTC m=+0.054073194 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 14:14:58 np0005535656 podman[216841]: 2025-11-25 19:14:58.989308219 +0000 UTC m=+0.104131389 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 25 14:14:59 np0005535656 ovn_controller[95460]: 2025-11-25T19:14:59Z|00163|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 14:14:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:59.091 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:14:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:59.091 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:14:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:14:59.092 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:14:59 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 14:14:59 np0005535656 systemd[216724]: Activating special unit Exit the Session...
Nov 25 14:14:59 np0005535656 systemd[216724]: Stopped target Main User Target.
Nov 25 14:14:59 np0005535656 systemd[216724]: Stopped target Basic System.
Nov 25 14:14:59 np0005535656 systemd[216724]: Stopped target Paths.
Nov 25 14:14:59 np0005535656 systemd[216724]: Stopped target Sockets.
Nov 25 14:14:59 np0005535656 systemd[216724]: Stopped target Timers.
Nov 25 14:14:59 np0005535656 systemd[216724]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:14:59 np0005535656 systemd[216724]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 14:14:59 np0005535656 systemd[216724]: Closed D-Bus User Message Bus Socket.
Nov 25 14:14:59 np0005535656 systemd[216724]: Stopped Create User's Volatile Files and Directories.
Nov 25 14:14:59 np0005535656 systemd[216724]: Removed slice User Application Slice.
Nov 25 14:14:59 np0005535656 systemd[216724]: Reached target Shutdown.
Nov 25 14:14:59 np0005535656 systemd[216724]: Finished Exit the Session.
Nov 25 14:14:59 np0005535656 systemd[216724]: Reached target Exit the Session.
Nov 25 14:14:59 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 14:14:59 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 14:14:59 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 14:14:59 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 14:14:59 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 14:14:59 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 14:14:59 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 14:15:00 np0005535656 nova_compute[187219]: 2025-11-25 19:15:00.435 187223 DEBUG nova.compute.manager [req-62f7434a-5736-42ac-ab74-b6a71137b565 req-314d4ac2-824d-412f-bf84-7d4b917f6e44 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:15:00 np0005535656 nova_compute[187219]: 2025-11-25 19:15:00.435 187223 DEBUG oslo_concurrency.lockutils [req-62f7434a-5736-42ac-ab74-b6a71137b565 req-314d4ac2-824d-412f-bf84-7d4b917f6e44 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:15:00 np0005535656 nova_compute[187219]: 2025-11-25 19:15:00.435 187223 DEBUG oslo_concurrency.lockutils [req-62f7434a-5736-42ac-ab74-b6a71137b565 req-314d4ac2-824d-412f-bf84-7d4b917f6e44 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:15:00 np0005535656 nova_compute[187219]: 2025-11-25 19:15:00.436 187223 DEBUG oslo_concurrency.lockutils [req-62f7434a-5736-42ac-ab74-b6a71137b565 req-314d4ac2-824d-412f-bf84-7d4b917f6e44 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:15:00 np0005535656 nova_compute[187219]: 2025-11-25 19:15:00.436 187223 DEBUG nova.compute.manager [req-62f7434a-5736-42ac-ab74-b6a71137b565 req-314d4ac2-824d-412f-bf84-7d4b917f6e44 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] No waiting events found dispatching network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:15:00 np0005535656 nova_compute[187219]: 2025-11-25 19:15:00.436 187223 WARNING nova.compute.manager [req-62f7434a-5736-42ac-ab74-b6a71137b565 req-314d4ac2-824d-412f-bf84-7d4b917f6e44 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Received unexpected event network-vif-plugged-8cc08f3a-912b-45b3-9b53-ad9415a67906 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.511 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Acquiring lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.511 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.512 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "4e32bc34-e262-44f0-b382-e97dd53aa66c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.549 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.549 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.550 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.550 187223 DEBUG nova.compute.resource_tracker [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.690 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.778 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.781 187223 WARNING nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.782 187223 DEBUG nova.compute.resource_tracker [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5867MB free_disk=73.1629409790039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.782 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.783 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.834 187223 DEBUG nova.compute.resource_tracker [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Migration for instance 4e32bc34-e262-44f0-b382-e97dd53aa66c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.869 187223 DEBUG nova.compute.resource_tracker [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.909 187223 DEBUG nova.compute.resource_tracker [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Migration 94b4e947-3fb9-41ee-8ab7-e4d4c80f1b55 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.909 187223 DEBUG nova.compute.resource_tracker [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.910 187223 DEBUG nova.compute.resource_tracker [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.957 187223 DEBUG nova.compute.provider_tree [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:15:02 np0005535656 nova_compute[187219]: 2025-11-25 19:15:02.981 187223 DEBUG nova.scheduler.client.report [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:15:03 np0005535656 nova_compute[187219]: 2025-11-25 19:15:03.012 187223 DEBUG nova.compute.resource_tracker [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:15:03 np0005535656 nova_compute[187219]: 2025-11-25 19:15:03.013 187223 DEBUG oslo_concurrency.lockutils [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:15:03 np0005535656 nova_compute[187219]: 2025-11-25 19:15:03.022 187223 INFO nova.compute.manager [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 25 14:15:03 np0005535656 nova_compute[187219]: 2025-11-25 19:15:03.124 187223 INFO nova.scheduler.client.report [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Deleted allocation for migration 94b4e947-3fb9-41ee-8ab7-e4d4c80f1b55#033[00m
Nov 25 14:15:03 np0005535656 nova_compute[187219]: 2025-11-25 19:15:03.125 187223 DEBUG nova.virt.libvirt.driver [None req-99f23efc-5ca1-4370-8a03-68449dadf2b8 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 25 14:15:04 np0005535656 podman[216891]: 2025-11-25 19:15:04.97666027 +0000 UTC m=+0.088698274 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41)
Nov 25 14:15:05 np0005535656 podman[197580]: time="2025-11-25T19:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:15:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:15:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 25 14:15:06 np0005535656 nova_compute[187219]: 2025-11-25 19:15:06.698 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:06 np0005535656 nova_compute[187219]: 2025-11-25 19:15:06.699 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:15:06 np0005535656 nova_compute[187219]: 2025-11-25 19:15:06.699 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:15:06 np0005535656 nova_compute[187219]: 2025-11-25 19:15:06.720 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:15:07 np0005535656 nova_compute[187219]: 2025-11-25 19:15:07.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:07 np0005535656 nova_compute[187219]: 2025-11-25 19:15:07.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:07 np0005535656 nova_compute[187219]: 2025-11-25 19:15:07.692 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:07 np0005535656 nova_compute[187219]: 2025-11-25 19:15:07.779 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:08 np0005535656 podman[216914]: 2025-11-25 19:15:08.938181326 +0000 UTC m=+0.061209465 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 25 14:15:11 np0005535656 nova_compute[187219]: 2025-11-25 19:15:11.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:11 np0005535656 nova_compute[187219]: 2025-11-25 19:15:11.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:11 np0005535656 nova_compute[187219]: 2025-11-25 19:15:11.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:11 np0005535656 nova_compute[187219]: 2025-11-25 19:15:11.918 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764098096.9173663, 4e32bc34-e262-44f0-b382-e97dd53aa66c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:15:11 np0005535656 nova_compute[187219]: 2025-11-25 19:15:11.918 187223 INFO nova.compute.manager [-] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:15:11 np0005535656 nova_compute[187219]: 2025-11-25 19:15:11.941 187223 DEBUG nova.compute.manager [None req-9cc5e5e0-f82d-4cc6-9a7a-573fbb0d39e2 - - - - - -] [instance: 4e32bc34-e262-44f0-b382-e97dd53aa66c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:15:12 np0005535656 nova_compute[187219]: 2025-11-25 19:15:12.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:12 np0005535656 nova_compute[187219]: 2025-11-25 19:15:12.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 25 14:15:12 np0005535656 nova_compute[187219]: 2025-11-25 19:15:12.695 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:12 np0005535656 nova_compute[187219]: 2025-11-25 19:15:12.780 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:13 np0005535656 nova_compute[187219]: 2025-11-25 19:15:13.692 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.669 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.695 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.725 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.726 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.726 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.727 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.937 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.938 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5886MB free_disk=73.1629409790039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.939 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:15:14 np0005535656 nova_compute[187219]: 2025-11-25 19:15:14.939 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:15:15 np0005535656 nova_compute[187219]: 2025-11-25 19:15:15.233 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:15:15 np0005535656 nova_compute[187219]: 2025-11-25 19:15:15.234 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:15:15 np0005535656 nova_compute[187219]: 2025-11-25 19:15:15.306 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:15:15 np0005535656 nova_compute[187219]: 2025-11-25 19:15:15.329 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:15:15 np0005535656 nova_compute[187219]: 2025-11-25 19:15:15.332 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:15:15 np0005535656 nova_compute[187219]: 2025-11-25 19:15:15.332 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:15:16 np0005535656 nova_compute[187219]: 2025-11-25 19:15:16.310 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:16 np0005535656 nova_compute[187219]: 2025-11-25 19:15:16.310 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:15:17 np0005535656 nova_compute[187219]: 2025-11-25 19:15:17.697 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:17 np0005535656 nova_compute[187219]: 2025-11-25 19:15:17.783 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:15:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:15:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:15:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:15:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:15:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:15:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:15:19 np0005535656 podman[216936]: 2025-11-25 19:15:19.970712076 +0000 UTC m=+0.081538371 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:15:21 np0005535656 nova_compute[187219]: 2025-11-25 19:15:21.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:22 np0005535656 nova_compute[187219]: 2025-11-25 19:15:22.699 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:22 np0005535656 nova_compute[187219]: 2025-11-25 19:15:22.785 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:23 np0005535656 nova_compute[187219]: 2025-11-25 19:15:23.704 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:23 np0005535656 nova_compute[187219]: 2025-11-25 19:15:23.705 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 25 14:15:23 np0005535656 nova_compute[187219]: 2025-11-25 19:15:23.722 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 25 14:15:27 np0005535656 nova_compute[187219]: 2025-11-25 19:15:27.729 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:27 np0005535656 nova_compute[187219]: 2025-11-25 19:15:27.787 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:29 np0005535656 podman[216961]: 2025-11-25 19:15:29.953339069 +0000 UTC m=+0.069215791 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true)
Nov 25 14:15:29 np0005535656 podman[216960]: 2025-11-25 19:15:29.978233597 +0000 UTC m=+0.101080926 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 25 14:15:31 np0005535656 nova_compute[187219]: 2025-11-25 19:15:31.811 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:15:32 np0005535656 nova_compute[187219]: 2025-11-25 19:15:32.732 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:32 np0005535656 nova_compute[187219]: 2025-11-25 19:15:32.790 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:35 np0005535656 podman[197580]: time="2025-11-25T19:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:15:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:15:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Nov 25 14:15:35 np0005535656 podman[217003]: 2025-11-25 19:15:35.980149131 +0000 UTC m=+0.094422458 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_id=edpm, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git)
Nov 25 14:15:37 np0005535656 nova_compute[187219]: 2025-11-25 19:15:37.768 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:37 np0005535656 nova_compute[187219]: 2025-11-25 19:15:37.791 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:39 np0005535656 podman[217025]: 2025-11-25 19:15:39.936125137 +0000 UTC m=+0.056375335 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 14:15:42 np0005535656 nova_compute[187219]: 2025-11-25 19:15:42.771 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:42 np0005535656 ovn_controller[95460]: 2025-11-25T19:15:42Z|00164|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Nov 25 14:15:42 np0005535656 nova_compute[187219]: 2025-11-25 19:15:42.794 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:47 np0005535656 nova_compute[187219]: 2025-11-25 19:15:47.773 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:47 np0005535656 nova_compute[187219]: 2025-11-25 19:15:47.797 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:15:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:15:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:15:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:15:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:15:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:15:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:15:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:15:50 np0005535656 podman[217045]: 2025-11-25 19:15:50.946505052 +0000 UTC m=+0.066409285 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:15:52 np0005535656 nova_compute[187219]: 2025-11-25 19:15:52.775 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:52 np0005535656 nova_compute[187219]: 2025-11-25 19:15:52.798 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:57 np0005535656 nova_compute[187219]: 2025-11-25 19:15:57.777 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:57 np0005535656 nova_compute[187219]: 2025-11-25 19:15:57.800 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:15:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:15:59.092 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:15:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:15:59.093 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:15:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:15:59.093 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:00 np0005535656 podman[217071]: 2025-11-25 19:16:00.964659409 +0000 UTC m=+0.072937120 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:16:00 np0005535656 podman[217070]: 2025-11-25 19:16:00.990616997 +0000 UTC m=+0.108964538 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 25 14:16:02 np0005535656 nova_compute[187219]: 2025-11-25 19:16:02.780 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:02 np0005535656 nova_compute[187219]: 2025-11-25 19:16:02.802 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:05 np0005535656 podman[197580]: time="2025-11-25T19:16:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:16:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:16:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:16:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:16:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 25 14:16:06 np0005535656 nova_compute[187219]: 2025-11-25 19:16:06.691 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:16:06 np0005535656 nova_compute[187219]: 2025-11-25 19:16:06.692 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:16:06 np0005535656 nova_compute[187219]: 2025-11-25 19:16:06.692 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:16:06 np0005535656 nova_compute[187219]: 2025-11-25 19:16:06.721 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:16:06 np0005535656 podman[217116]: 2025-11-25 19:16:06.963847278 +0000 UTC m=+0.082426216 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6)
Nov 25 14:16:07 np0005535656 nova_compute[187219]: 2025-11-25 19:16:07.783 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:07 np0005535656 nova_compute[187219]: 2025-11-25 19:16:07.804 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:08 np0005535656 nova_compute[187219]: 2025-11-25 19:16:08.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:16:08 np0005535656 nova_compute[187219]: 2025-11-25 19:16:08.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:16:08 np0005535656 nova_compute[187219]: 2025-11-25 19:16:08.992 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:16:08 np0005535656 nova_compute[187219]: 2025-11-25 19:16:08.993 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.027 187223 DEBUG nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.155 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.156 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.167 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.167 187223 INFO nova.compute.claims [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.334 187223 DEBUG nova.compute.provider_tree [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.351 187223 DEBUG nova.scheduler.client.report [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.385 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.386 187223 DEBUG nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.459 187223 DEBUG nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.460 187223 DEBUG nova.network.neutron [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.495 187223 INFO nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.523 187223 DEBUG nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.651 187223 DEBUG nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.653 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.654 187223 INFO nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Creating image(s)#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.655 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "/var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.655 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.657 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "/var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.683 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.757 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.759 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.759 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.776 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.841 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.842 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.890 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.892 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.892 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.945 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.946 187223 DEBUG nova.virt.disk.api [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Checking if we can resize image /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:16:09 np0005535656 nova_compute[187219]: 2025-11-25 19:16:09.947 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:16:10 np0005535656 nova_compute[187219]: 2025-11-25 19:16:10.025 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:16:10 np0005535656 nova_compute[187219]: 2025-11-25 19:16:10.026 187223 DEBUG nova.virt.disk.api [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Cannot resize image /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:16:10 np0005535656 nova_compute[187219]: 2025-11-25 19:16:10.027 187223 DEBUG nova.objects.instance [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'migration_context' on Instance uuid bbf75eb3-0515-4610-a8c5-d8999a111b47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:16:10 np0005535656 nova_compute[187219]: 2025-11-25 19:16:10.066 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:16:10 np0005535656 nova_compute[187219]: 2025-11-25 19:16:10.066 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Ensure instance console log exists: /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:16:10 np0005535656 nova_compute[187219]: 2025-11-25 19:16:10.067 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:16:10 np0005535656 nova_compute[187219]: 2025-11-25 19:16:10.067 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:16:10 np0005535656 nova_compute[187219]: 2025-11-25 19:16:10.067 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:10 np0005535656 nova_compute[187219]: 2025-11-25 19:16:10.370 187223 DEBUG nova.policy [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e60aa8a36ef94fa186a5c8de1df9e594', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab3670f92d82410b981d159346c0c038', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:16:10 np0005535656 podman[217153]: 2025-11-25 19:16:10.986885966 +0000 UTC m=+0.098908307 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 25 14:16:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:11.341 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:16:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:11.342 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:16:11 np0005535656 nova_compute[187219]: 2025-11-25 19:16:11.382 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:11 np0005535656 nova_compute[187219]: 2025-11-25 19:16:11.452 187223 DEBUG nova.network.neutron [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Successfully created port: 6d0fcfbf-d5de-4b58-9223-ed19141e11fb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:16:11 np0005535656 nova_compute[187219]: 2025-11-25 19:16:11.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.438 187223 DEBUG nova.network.neutron [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Successfully updated port: 6d0fcfbf-d5de-4b58-9223-ed19141e11fb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.480 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.481 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquired lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.481 187223 DEBUG nova.network.neutron [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.547 187223 DEBUG nova.compute.manager [req-5625737c-e5bc-4f4e-96e6-8782d3f8353b req-91a66f8a-cf19-4843-bf4d-ef58d6f7fc35 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-changed-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.548 187223 DEBUG nova.compute.manager [req-5625737c-e5bc-4f4e-96e6-8782d3f8353b req-91a66f8a-cf19-4843-bf4d-ef58d6f7fc35 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Refreshing instance network info cache due to event network-changed-6d0fcfbf-d5de-4b58-9223-ed19141e11fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.548 187223 DEBUG oslo_concurrency.lockutils [req-5625737c-e5bc-4f4e-96e6-8782d3f8353b req-91a66f8a-cf19-4843-bf4d-ef58d6f7fc35 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.677 187223 DEBUG nova.network.neutron [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.785 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:12 np0005535656 nova_compute[187219]: 2025-11-25 19:16:12.807 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.319 187223 DEBUG nova.network.neutron [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Updating instance_info_cache with network_info: [{"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:16:14 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:14.344 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.347 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Releasing lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.348 187223 DEBUG nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Instance network_info: |[{"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.349 187223 DEBUG oslo_concurrency.lockutils [req-5625737c-e5bc-4f4e-96e6-8782d3f8353b req-91a66f8a-cf19-4843-bf4d-ef58d6f7fc35 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.349 187223 DEBUG nova.network.neutron [req-5625737c-e5bc-4f4e-96e6-8782d3f8353b req-91a66f8a-cf19-4843-bf4d-ef58d6f7fc35 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Refreshing network info cache for port 6d0fcfbf-d5de-4b58-9223-ed19141e11fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.354 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Start _get_guest_xml network_info=[{"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.361 187223 WARNING nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.371 187223 DEBUG nova.virt.libvirt.host [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.372 187223 DEBUG nova.virt.libvirt.host [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.379 187223 DEBUG nova.virt.libvirt.host [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.380 187223 DEBUG nova.virt.libvirt.host [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.382 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.383 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.383 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.384 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.384 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.385 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.385 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.386 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.386 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.387 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.387 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.387 187223 DEBUG nova.virt.hardware [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.394 187223 DEBUG nova.virt.libvirt.vif [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1209546672',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1209546672',id=23,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-xxg3mc0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=TagList,t
ask_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:16:09Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=bbf75eb3-0515-4610-a8c5-d8999a111b47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.395 187223 DEBUG nova.network.os_vif_util [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.396 187223 DEBUG nova.network.os_vif_util [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:15:92,bridge_name='br-int',has_traffic_filtering=True,id=6d0fcfbf-d5de-4b58-9223-ed19141e11fb,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0fcfbf-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.397 187223 DEBUG nova.objects.instance [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lazy-loading 'pci_devices' on Instance uuid bbf75eb3-0515-4610-a8c5-d8999a111b47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.417 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <uuid>bbf75eb3-0515-4610-a8c5-d8999a111b47</uuid>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <name>instance-00000017</name>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteStrategies-server-1209546672</nova:name>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:16:14</nova:creationTime>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:        <nova:user uuid="e60aa8a36ef94fa186a5c8de1df9e594">tempest-TestExecuteStrategies-2025590332-project-member</nova:user>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:        <nova:project uuid="ab3670f92d82410b981d159346c0c038">tempest-TestExecuteStrategies-2025590332</nova:project>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:        <nova:port uuid="6d0fcfbf-d5de-4b58-9223-ed19141e11fb">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <entry name="serial">bbf75eb3-0515-4610-a8c5-d8999a111b47</entry>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <entry name="uuid">bbf75eb3-0515-4610-a8c5-d8999a111b47</entry>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk.config"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:ef:15:92"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <target dev="tap6d0fcfbf-d5"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/console.log" append="off"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:16:14 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:16:14 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:16:14 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:16:14 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.420 187223 DEBUG nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Preparing to wait for external event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.420 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.421 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.421 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.422 187223 DEBUG nova.virt.libvirt.vif [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1209546672',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1209546672',id=23,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-xxg3mc0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:16:09Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=bbf75eb3-0515-4610-a8c5-d8999a111b47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.423 187223 DEBUG nova.network.os_vif_util [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converting VIF {"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.424 187223 DEBUG nova.network.os_vif_util [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:15:92,bridge_name='br-int',has_traffic_filtering=True,id=6d0fcfbf-d5de-4b58-9223-ed19141e11fb,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0fcfbf-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.424 187223 DEBUG os_vif [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:15:92,bridge_name='br-int',has_traffic_filtering=True,id=6d0fcfbf-d5de-4b58-9223-ed19141e11fb,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0fcfbf-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.425 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.426 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.426 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.430 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.431 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d0fcfbf-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.431 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6d0fcfbf-d5, col_values=(('external_ids', {'iface-id': '6d0fcfbf-d5de-4b58-9223-ed19141e11fb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:15:92', 'vm-uuid': 'bbf75eb3-0515-4610-a8c5-d8999a111b47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.432 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:16:14 np0005535656 NetworkManager[55548]: <info>  [1764098174.4336] manager: (tap6d0fcfbf-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.435 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.438 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.438 187223 INFO os_vif [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:15:92,bridge_name='br-int',has_traffic_filtering=True,id=6d0fcfbf-d5de-4b58-9223-ed19141e11fb,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0fcfbf-d5')
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.494 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.495 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.495 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] No VIF found with MAC fa:16:3e:ef:15:92, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.495 187223 INFO nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Using config drive
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.696 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.697 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.697 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.698 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.761 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.856 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.857 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.945 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:16:14 np0005535656 nova_compute[187219]: 2025-11-25 19:16:14.947 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000017, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk.config'
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.148 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.150 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5871MB free_disk=73.16270065307617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.150 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.150 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.235 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance bbf75eb3-0515-4610-a8c5-d8999a111b47 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.235 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.236 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.481 187223 INFO nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Creating config drive at /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk.config
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.490 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv1tks4ie execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.527 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.574 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.611 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.611 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.615 187223 DEBUG oslo_concurrency.processutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv1tks4ie" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 25 14:16:15 np0005535656 kernel: tap6d0fcfbf-d5: entered promiscuous mode
Nov 25 14:16:15 np0005535656 ovn_controller[95460]: 2025-11-25T19:16:15Z|00165|binding|INFO|Claiming lport 6d0fcfbf-d5de-4b58-9223-ed19141e11fb for this chassis.
Nov 25 14:16:15 np0005535656 ovn_controller[95460]: 2025-11-25T19:16:15Z|00166|binding|INFO|6d0fcfbf-d5de-4b58-9223-ed19141e11fb: Claiming fa:16:3e:ef:15:92 10.100.0.10
Nov 25 14:16:15 np0005535656 NetworkManager[55548]: <info>  [1764098175.7173] manager: (tap6d0fcfbf-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.715 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.726 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:15:92 10.100.0.10'], port_security=['fa:16:3e:ef:15:92 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bbf75eb3-0515-4610-a8c5-d8999a111b47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=6d0fcfbf-d5de-4b58-9223-ed19141e11fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.728 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 6d0fcfbf-d5de-4b58-9223-ed19141e11fb in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 bound to our chassis
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.731 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:16:15 np0005535656 ovn_controller[95460]: 2025-11-25T19:16:15Z|00167|binding|INFO|Setting lport 6d0fcfbf-d5de-4b58-9223-ed19141e11fb ovn-installed in OVS
Nov 25 14:16:15 np0005535656 ovn_controller[95460]: 2025-11-25T19:16:15Z|00168|binding|INFO|Setting lport 6d0fcfbf-d5de-4b58-9223-ed19141e11fb up in Southbound
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.735 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.738 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.745 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[472f0363-e689-49db-a119-254196fd82b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.746 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e881e87-b1 in ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.748 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e881e87-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.749 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[cf943f10-dacf-44d3-ad52-acc5db6e2053]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.750 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2ff149-ede9-4d8f-ae46-ddda8c9324c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.766 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[0de433a1-f028-4df5-98e2-93c93cc35900]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 systemd-udevd[217202]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:16:15 np0005535656 systemd-machined[153481]: New machine qemu-15-instance-00000017.
Nov 25 14:16:15 np0005535656 NetworkManager[55548]: <info>  [1764098175.7822] device (tap6d0fcfbf-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:16:15 np0005535656 NetworkManager[55548]: <info>  [1764098175.7838] device (tap6d0fcfbf-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.789 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[303e020a-c4fe-4093-a70e-7d392d1c1fa1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 systemd[1]: Started Virtual Machine qemu-15-instance-00000017.
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.828 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[bbac4927-274d-453b-9b09-72ad1d025a7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.835 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a0eb9f1b-f67f-4860-b35d-9f65961f9eb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 NetworkManager[55548]: <info>  [1764098175.8368] manager: (tap8e881e87-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.862 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9a8d36-87b2-4d64-80b7-fbbb70cdce8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.867 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[2205eccc-4e65-412e-aef6-3573cdc0edfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 NetworkManager[55548]: <info>  [1764098175.8916] device (tap8e881e87-b0): carrier: link connected
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.896 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f04774-ce9b-4397-80f1-903234194a2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.913 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd59fe3-b5cb-42fb-83c0-9b6d2d5fe1a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513741, 'reachable_time': 26610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217234, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.927 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e1f7bc-4764-41f2-baa4-9938b783399f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:6d5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 513741, 'tstamp': 513741}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217235, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.945 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[fca22f69-c3e0-4bee-a4c0-b913551406ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e881e87-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:6d:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513741, 'reachable_time': 26610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217236, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:16:15 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:15.980 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a43393-4bd5-4160-ac88-2c7ffe77a315]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.994 187223 DEBUG nova.compute.manager [req-99216c04-e029-4f59-8d65-39575747c442 req-68b3f678-d121-4adf-abd1-66c686590ffe 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.994 187223 DEBUG oslo_concurrency.lockutils [req-99216c04-e029-4f59-8d65-39575747c442 req-68b3f678-d121-4adf-abd1-66c686590ffe 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.995 187223 DEBUG oslo_concurrency.lockutils [req-99216c04-e029-4f59-8d65-39575747c442 req-68b3f678-d121-4adf-abd1-66c686590ffe 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.995 187223 DEBUG oslo_concurrency.lockutils [req-99216c04-e029-4f59-8d65-39575747c442 req-68b3f678-d121-4adf-abd1-66c686590ffe 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:15 np0005535656 nova_compute[187219]: 2025-11-25 19:16:15.995 187223 DEBUG nova.compute.manager [req-99216c04-e029-4f59-8d65-39575747c442 req-68b3f678-d121-4adf-abd1-66c686590ffe 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Processing event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:16.056 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[579f8a3a-d100-46da-911b-651fa899d3e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:16.058 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:16.058 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:16.059 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e881e87-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:16:16 np0005535656 NetworkManager[55548]: <info>  [1764098176.0617] manager: (tap8e881e87-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Nov 25 14:16:16 np0005535656 kernel: tap8e881e87-b0: entered promiscuous mode
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:16.065 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e881e87-b0, col_values=(('external_ids', {'iface-id': 'f01fca37-0f9e-4574-bd34-7de06647d521'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:16:16 np0005535656 ovn_controller[95460]: 2025-11-25T19:16:16Z|00169|binding|INFO|Releasing lport f01fca37-0f9e-4574-bd34-7de06647d521 from this chassis (sb_readonly=0)
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.073 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.081 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:16.082 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:16.083 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a244b8-98ba-480d-8eac-28568d953163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:16.084 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/8e881e87-b103-4ad8-8de5-f8f4f0a10891.pid.haproxy
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 8e881e87-b103-4ad8-8de5-f8f4f0a10891
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 14:16:16 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:16.085 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'env', 'PROCESS_TAG=haproxy-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e881e87-b103-4ad8-8de5-f8f4f0a10891.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.186 187223 DEBUG nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.187 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098176.1874764, bbf75eb3-0515-4610-a8c5-d8999a111b47 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.188 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] VM Started (Lifecycle Event)#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.198 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.202 187223 INFO nova.virt.libvirt.driver [-] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Instance spawned successfully.#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.202 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.352 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.359 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.359 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.359 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.360 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.360 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.360 187223 DEBUG nova.virt.libvirt.driver [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.365 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.413 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.414 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098176.1900828, bbf75eb3-0515-4610-a8c5-d8999a111b47 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.414 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.428 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.432 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098176.1915383, bbf75eb3-0515-4610-a8c5-d8999a111b47 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.432 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.437 187223 INFO nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Took 6.79 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.437 187223 DEBUG nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.458 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.461 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.487 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:16:16 np0005535656 podman[217275]: 2025-11-25 19:16:16.489303349 +0000 UTC m=+0.082770984 container create 21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.495 187223 INFO nova.compute.manager [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Took 7.41 seconds to build instance.#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.512 187223 DEBUG oslo_concurrency.lockutils [None req-248e2811-b493-41c9-83c8-ee4e51609ffb e60aa8a36ef94fa186a5c8de1df9e594 ab3670f92d82410b981d159346c0c038 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:16 np0005535656 podman[217275]: 2025-11-25 19:16:16.435388481 +0000 UTC m=+0.028856126 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:16:16 np0005535656 systemd[1]: Started libpod-conmon-21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1.scope.
Nov 25 14:16:16 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:16:16 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f09ef4e80801c04a4e2e73d28b1fa6ca390ce4ebf86b5858e82d1d42ca430dd3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:16:16 np0005535656 podman[217275]: 2025-11-25 19:16:16.591627728 +0000 UTC m=+0.185095413 container init 21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 25 14:16:16 np0005535656 podman[217275]: 2025-11-25 19:16:16.597472285 +0000 UTC m=+0.190939930 container start 21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.608 187223 DEBUG nova.network.neutron [req-5625737c-e5bc-4f4e-96e6-8782d3f8353b req-91a66f8a-cf19-4843-bf4d-ef58d6f7fc35 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Updated VIF entry in instance network info cache for port 6d0fcfbf-d5de-4b58-9223-ed19141e11fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.608 187223 DEBUG nova.network.neutron [req-5625737c-e5bc-4f4e-96e6-8782d3f8353b req-91a66f8a-cf19-4843-bf4d-ef58d6f7fc35 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Updating instance_info_cache with network_info: [{"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.613 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:16:16 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[217290]: [NOTICE]   (217294) : New worker (217296) forked
Nov 25 14:16:16 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[217290]: [NOTICE]   (217294) : Loading success.
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.635 187223 DEBUG oslo_concurrency.lockutils [req-5625737c-e5bc-4f4e-96e6-8782d3f8353b req-91a66f8a-cf19-4843-bf4d-ef58d6f7fc35 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:16:16 np0005535656 nova_compute[187219]: 2025-11-25 19:16:16.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:16:17 np0005535656 nova_compute[187219]: 2025-11-25 19:16:17.814 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:18 np0005535656 nova_compute[187219]: 2025-11-25 19:16:18.347 187223 DEBUG nova.compute.manager [req-dc7c6a22-433c-4a9b-8758-45d78ac4668d req-31e22dc5-243f-4c01-baeb-46953edaf07f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:16:18 np0005535656 nova_compute[187219]: 2025-11-25 19:16:18.347 187223 DEBUG oslo_concurrency.lockutils [req-dc7c6a22-433c-4a9b-8758-45d78ac4668d req-31e22dc5-243f-4c01-baeb-46953edaf07f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:16:18 np0005535656 nova_compute[187219]: 2025-11-25 19:16:18.347 187223 DEBUG oslo_concurrency.lockutils [req-dc7c6a22-433c-4a9b-8758-45d78ac4668d req-31e22dc5-243f-4c01-baeb-46953edaf07f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:16:18 np0005535656 nova_compute[187219]: 2025-11-25 19:16:18.347 187223 DEBUG oslo_concurrency.lockutils [req-dc7c6a22-433c-4a9b-8758-45d78ac4668d req-31e22dc5-243f-4c01-baeb-46953edaf07f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:18 np0005535656 nova_compute[187219]: 2025-11-25 19:16:18.348 187223 DEBUG nova.compute.manager [req-dc7c6a22-433c-4a9b-8758-45d78ac4668d req-31e22dc5-243f-4c01-baeb-46953edaf07f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] No waiting events found dispatching network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:16:18 np0005535656 nova_compute[187219]: 2025-11-25 19:16:18.348 187223 WARNING nova.compute.manager [req-dc7c6a22-433c-4a9b-8758-45d78ac4668d req-31e22dc5-243f-4c01-baeb-46953edaf07f 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received unexpected event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb for instance with vm_state active and task_state None.#033[00m
Nov 25 14:16:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:16:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:16:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:16:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:16:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:16:19 np0005535656 nova_compute[187219]: 2025-11-25 19:16:19.441 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:21 np0005535656 podman[217305]: 2025-11-25 19:16:21.984178201 +0000 UTC m=+0.088666324 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:16:22 np0005535656 nova_compute[187219]: 2025-11-25 19:16:22.816 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:24 np0005535656 nova_compute[187219]: 2025-11-25 19:16:24.443 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:27 np0005535656 nova_compute[187219]: 2025-11-25 19:16:27.819 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:29 np0005535656 nova_compute[187219]: 2025-11-25 19:16:29.445 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:29 np0005535656 ovn_controller[95460]: 2025-11-25T19:16:29Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ef:15:92 10.100.0.10
Nov 25 14:16:29 np0005535656 ovn_controller[95460]: 2025-11-25T19:16:29Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ef:15:92 10.100.0.10
Nov 25 14:16:31 np0005535656 podman[217347]: 2025-11-25 19:16:31.996330716 +0000 UTC m=+0.110645894 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 25 14:16:32 np0005535656 podman[217348]: 2025-11-25 19:16:32.010401354 +0000 UTC m=+0.110767857 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:16:32 np0005535656 nova_compute[187219]: 2025-11-25 19:16:32.822 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:34 np0005535656 nova_compute[187219]: 2025-11-25 19:16:34.447 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:35 np0005535656 podman[197580]: time="2025-11-25T19:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:16:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:16:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Nov 25 14:16:37 np0005535656 nova_compute[187219]: 2025-11-25 19:16:37.824 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:37 np0005535656 podman[217394]: 2025-11-25 19:16:37.96403558 +0000 UTC m=+0.073515377 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 14:16:39 np0005535656 nova_compute[187219]: 2025-11-25 19:16:39.451 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:41 np0005535656 podman[217416]: 2025-11-25 19:16:41.956084776 +0000 UTC m=+0.074676248 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 14:16:42 np0005535656 nova_compute[187219]: 2025-11-25 19:16:42.828 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:44 np0005535656 nova_compute[187219]: 2025-11-25 19:16:44.453 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:47 np0005535656 nova_compute[187219]: 2025-11-25 19:16:47.830 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:16:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:16:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:16:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:16:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:16:49 np0005535656 nova_compute[187219]: 2025-11-25 19:16:49.456 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:52 np0005535656 nova_compute[187219]: 2025-11-25 19:16:52.833 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:52 np0005535656 podman[217436]: 2025-11-25 19:16:52.968906074 +0000 UTC m=+0.083502693 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:16:54 np0005535656 nova_compute[187219]: 2025-11-25 19:16:54.458 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:57 np0005535656 nova_compute[187219]: 2025-11-25 19:16:57.834 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:16:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:59.093 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:16:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:59.093 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:16:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:16:59.094 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:16:59 np0005535656 nova_compute[187219]: 2025-11-25 19:16:59.461 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:02 np0005535656 ovn_controller[95460]: 2025-11-25T19:17:02Z|00170|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Nov 25 14:17:02 np0005535656 nova_compute[187219]: 2025-11-25 19:17:02.868 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:02 np0005535656 podman[217462]: 2025-11-25 19:17:02.982265844 +0000 UTC m=+0.072232222 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 25 14:17:02 np0005535656 podman[217461]: 2025-11-25 19:17:02.986543859 +0000 UTC m=+0.086429663 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:17:04 np0005535656 nova_compute[187219]: 2025-11-25 19:17:04.465 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:05 np0005535656 podman[197580]: time="2025-11-25T19:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:17:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:17:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3065 "" "Go-http-client/1.1"
Nov 25 14:17:06 np0005535656 nova_compute[187219]: 2025-11-25 19:17:06.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:06 np0005535656 nova_compute[187219]: 2025-11-25 19:17:06.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:17:06 np0005535656 nova_compute[187219]: 2025-11-25 19:17:06.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:17:06 np0005535656 nova_compute[187219]: 2025-11-25 19:17:06.875 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:17:06 np0005535656 nova_compute[187219]: 2025-11-25 19:17:06.875 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:17:06 np0005535656 nova_compute[187219]: 2025-11-25 19:17:06.875 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 14:17:06 np0005535656 nova_compute[187219]: 2025-11-25 19:17:06.876 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid bbf75eb3-0515-4610-a8c5-d8999a111b47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:17:07 np0005535656 nova_compute[187219]: 2025-11-25 19:17:07.871 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:08 np0005535656 nova_compute[187219]: 2025-11-25 19:17:08.763 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Updating instance_info_cache with network_info: [{"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:17:08 np0005535656 nova_compute[187219]: 2025-11-25 19:17:08.785 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:17:08 np0005535656 nova_compute[187219]: 2025-11-25 19:17:08.785 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 14:17:08 np0005535656 podman[217506]: 2025-11-25 19:17:08.951399946 +0000 UTC m=+0.073780314 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, config_id=edpm)
Nov 25 14:17:09 np0005535656 nova_compute[187219]: 2025-11-25 19:17:09.468 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:10 np0005535656 nova_compute[187219]: 2025-11-25 19:17:10.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:10 np0005535656 nova_compute[187219]: 2025-11-25 19:17:10.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:11 np0005535656 nova_compute[187219]: 2025-11-25 19:17:11.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:12 np0005535656 nova_compute[187219]: 2025-11-25 19:17:12.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:12 np0005535656 nova_compute[187219]: 2025-11-25 19:17:12.911 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:13 np0005535656 podman[217529]: 2025-11-25 19:17:13.002346885 +0000 UTC m=+0.078669434 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 14:17:14 np0005535656 nova_compute[187219]: 2025-11-25 19:17:14.471 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:14 np0005535656 nova_compute[187219]: 2025-11-25 19:17:14.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.259 187223 DEBUG nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Check if temp file /var/lib/nova/instances/tmp7rp0vngp exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.260 187223 DEBUG nova.compute.manager [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7rp0vngp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bbf75eb3-0515-4610-a8c5-d8999a111b47',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.683 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.704 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.705 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.705 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.705 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.794 187223 DEBUG oslo_concurrency.processutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.817 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.857 187223 DEBUG oslo_concurrency.processutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.859 187223 DEBUG oslo_concurrency.processutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.880 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.881 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.935 187223 DEBUG oslo_concurrency.processutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:17:15 np0005535656 nova_compute[187219]: 2025-11-25 19:17:15.943 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.100 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.101 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5697MB free_disk=73.13419723510742GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.101 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.102 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.166 187223 INFO nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Updating resource usage from migration 12e2895c-c42b-493d-91f0-2b76bf51b3c9#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.202 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Migration 12e2895c-c42b-493d-91f0-2b76bf51b3c9 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.202 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.202 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.252 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.264 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.289 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:17:16 np0005535656 nova_compute[187219]: 2025-11-25 19:17:16.289 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:17 np0005535656 nova_compute[187219]: 2025-11-25 19:17:17.278 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:17 np0005535656 nova_compute[187219]: 2025-11-25 19:17:17.279 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:17:17 np0005535656 nova_compute[187219]: 2025-11-25 19:17:17.279 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:17:17 np0005535656 nova_compute[187219]: 2025-11-25 19:17:17.913 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:18 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 14:17:18 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 14:17:18 np0005535656 systemd-logind[788]: New session 42 of user nova.
Nov 25 14:17:18 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 14:17:18 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 14:17:18 np0005535656 systemd[217568]: Queued start job for default target Main User Target.
Nov 25 14:17:18 np0005535656 systemd[217568]: Created slice User Application Slice.
Nov 25 14:17:18 np0005535656 systemd[217568]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:17:18 np0005535656 systemd[217568]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 14:17:18 np0005535656 systemd[217568]: Reached target Paths.
Nov 25 14:17:18 np0005535656 systemd[217568]: Reached target Timers.
Nov 25 14:17:18 np0005535656 systemd[217568]: Starting D-Bus User Message Bus Socket...
Nov 25 14:17:18 np0005535656 systemd[217568]: Starting Create User's Volatile Files and Directories...
Nov 25 14:17:18 np0005535656 systemd[217568]: Finished Create User's Volatile Files and Directories.
Nov 25 14:17:18 np0005535656 systemd[217568]: Listening on D-Bus User Message Bus Socket.
Nov 25 14:17:18 np0005535656 systemd[217568]: Reached target Sockets.
Nov 25 14:17:18 np0005535656 systemd[217568]: Reached target Basic System.
Nov 25 14:17:18 np0005535656 systemd[217568]: Reached target Main User Target.
Nov 25 14:17:18 np0005535656 systemd[217568]: Startup finished in 152ms.
Nov 25 14:17:18 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 14:17:18 np0005535656 systemd[1]: Started Session 42 of User nova.
Nov 25 14:17:18 np0005535656 systemd[1]: session-42.scope: Deactivated successfully.
Nov 25 14:17:18 np0005535656 systemd-logind[788]: Session 42 logged out. Waiting for processes to exit.
Nov 25 14:17:18 np0005535656 systemd-logind[788]: Removed session 42.
Nov 25 14:17:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:17:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:17:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:17:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:17:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:17:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:17:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:17:19 np0005535656 nova_compute[187219]: 2025-11-25 19:17:19.473 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:19 np0005535656 nova_compute[187219]: 2025-11-25 19:17:19.680 187223 DEBUG nova.compute.manager [req-4fcf1394-ca27-49b7-815f-983204892c7b req-fe57b513-fe31-4301-bce3-2ccc1d341981 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-unplugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:17:19 np0005535656 nova_compute[187219]: 2025-11-25 19:17:19.680 187223 DEBUG oslo_concurrency.lockutils [req-4fcf1394-ca27-49b7-815f-983204892c7b req-fe57b513-fe31-4301-bce3-2ccc1d341981 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:19 np0005535656 nova_compute[187219]: 2025-11-25 19:17:19.681 187223 DEBUG oslo_concurrency.lockutils [req-4fcf1394-ca27-49b7-815f-983204892c7b req-fe57b513-fe31-4301-bce3-2ccc1d341981 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:19 np0005535656 nova_compute[187219]: 2025-11-25 19:17:19.681 187223 DEBUG oslo_concurrency.lockutils [req-4fcf1394-ca27-49b7-815f-983204892c7b req-fe57b513-fe31-4301-bce3-2ccc1d341981 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:19 np0005535656 nova_compute[187219]: 2025-11-25 19:17:19.681 187223 DEBUG nova.compute.manager [req-4fcf1394-ca27-49b7-815f-983204892c7b req-fe57b513-fe31-4301-bce3-2ccc1d341981 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] No waiting events found dispatching network-vif-unplugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:17:19 np0005535656 nova_compute[187219]: 2025-11-25 19:17:19.681 187223 DEBUG nova.compute.manager [req-4fcf1394-ca27-49b7-815f-983204892c7b req-fe57b513-fe31-4301-bce3-2ccc1d341981 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-unplugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:17:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:20.371 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:17:20 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:20.372 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:17:20 np0005535656 nova_compute[187219]: 2025-11-25 19:17:20.417 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.412 187223 INFO nova.compute.manager [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Took 5.48 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.413 187223 DEBUG nova.compute.manager [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.431 187223 DEBUG nova.compute.manager [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7rp0vngp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bbf75eb3-0515-4610-a8c5-d8999a111b47',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(12e2895c-c42b-493d-91f0-2b76bf51b3c9),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.452 187223 DEBUG nova.objects.instance [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid bbf75eb3-0515-4610-a8c5-d8999a111b47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.454 187223 DEBUG nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.457 187223 DEBUG nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.457 187223 DEBUG nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.472 187223 DEBUG nova.virt.libvirt.vif [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1209546672',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1209546672',id=23,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:16:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-xxg3mc0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:16:16Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=bbf75eb3-0515-4610-a8c5-d8999a111b47,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.473 187223 DEBUG nova.network.os_vif_util [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.474 187223 DEBUG nova.network.os_vif_util [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ef:15:92,bridge_name='br-int',has_traffic_filtering=True,id=6d0fcfbf-d5de-4b58-9223-ed19141e11fb,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0fcfbf-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.475 187223 DEBUG nova.virt.libvirt.migration [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 14:17:21 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:ef:15:92"/>
Nov 25 14:17:21 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 14:17:21 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:17:21 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 14:17:21 np0005535656 nova_compute[187219]:  <target dev="tap6d0fcfbf-d5"/>
Nov 25 14:17:21 np0005535656 nova_compute[187219]: </interface>
Nov 25 14:17:21 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.476 187223 DEBUG nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.832 187223 DEBUG nova.compute.manager [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.833 187223 DEBUG oslo_concurrency.lockutils [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.834 187223 DEBUG oslo_concurrency.lockutils [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.834 187223 DEBUG oslo_concurrency.lockutils [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.835 187223 DEBUG nova.compute.manager [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] No waiting events found dispatching network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.835 187223 WARNING nova.compute.manager [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received unexpected event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.835 187223 DEBUG nova.compute.manager [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-changed-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.836 187223 DEBUG nova.compute.manager [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Refreshing instance network info cache due to event network-changed-6d0fcfbf-d5de-4b58-9223-ed19141e11fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.836 187223 DEBUG oslo_concurrency.lockutils [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.836 187223 DEBUG oslo_concurrency.lockutils [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.837 187223 DEBUG nova.network.neutron [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Refreshing network info cache for port 6d0fcfbf-d5de-4b58-9223-ed19141e11fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.960 187223 DEBUG nova.virt.libvirt.migration [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:17:21 np0005535656 nova_compute[187219]: 2025-11-25 19:17:21.960 187223 INFO nova.virt.libvirt.migration [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 14:17:22 np0005535656 nova_compute[187219]: 2025-11-25 19:17:22.041 187223 INFO nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 14:17:22 np0005535656 nova_compute[187219]: 2025-11-25 19:17:22.545 187223 DEBUG nova.virt.libvirt.migration [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:17:22 np0005535656 nova_compute[187219]: 2025-11-25 19:17:22.545 187223 DEBUG nova.virt.libvirt.migration [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:17:22 np0005535656 nova_compute[187219]: 2025-11-25 19:17:22.978 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.048 187223 DEBUG nova.virt.libvirt.migration [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.049 187223 DEBUG nova.virt.libvirt.migration [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.514 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098243.5138087, bbf75eb3-0515-4610-a8c5-d8999a111b47 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.515 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.543 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.547 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.552 187223 DEBUG nova.virt.libvirt.migration [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.553 187223 DEBUG nova.virt.libvirt.migration [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.570 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 14:17:23 np0005535656 kernel: tap6d0fcfbf-d5 (unregistering): left promiscuous mode
Nov 25 14:17:23 np0005535656 NetworkManager[55548]: <info>  [1764098243.6930] device (tap6d0fcfbf-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:17:23 np0005535656 ovn_controller[95460]: 2025-11-25T19:17:23Z|00171|binding|INFO|Releasing lport 6d0fcfbf-d5de-4b58-9223-ed19141e11fb from this chassis (sb_readonly=0)
Nov 25 14:17:23 np0005535656 ovn_controller[95460]: 2025-11-25T19:17:23Z|00172|binding|INFO|Setting lport 6d0fcfbf-d5de-4b58-9223-ed19141e11fb down in Southbound
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.703 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:23 np0005535656 ovn_controller[95460]: 2025-11-25T19:17:23Z|00173|binding|INFO|Removing iface tap6d0fcfbf-d5 ovn-installed in OVS
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.706 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:23.712 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:15:92 10.100.0.10'], port_security=['fa:16:3e:ef:15:92 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bbf75eb3-0515-4610-a8c5-d8999a111b47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab3670f92d82410b981d159346c0c038', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2f57892c-3db7-4fb0-bf1d-cbd530236202', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=612db7fa-9536-4e67-bcd7-1cd2faf68d26, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=6d0fcfbf-d5de-4b58-9223-ed19141e11fb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:17:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:23.714 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 6d0fcfbf-d5de-4b58-9223-ed19141e11fb in datapath 8e881e87-b103-4ad8-8de5-f8f4f0a10891 unbound from our chassis#033[00m
Nov 25 14:17:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:23.716 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e881e87-b103-4ad8-8de5-f8f4f0a10891, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:17:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:23.718 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[82a89aee-f23e-4ca1-a453-bd3500b76418]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:17:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:23.719 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 namespace which is not needed anymore#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.734 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:23 np0005535656 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000017.scope: Deactivated successfully.
Nov 25 14:17:23 np0005535656 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000017.scope: Consumed 15.929s CPU time.
Nov 25 14:17:23 np0005535656 systemd-machined[153481]: Machine qemu-15-instance-00000017 terminated.
Nov 25 14:17:23 np0005535656 podman[217602]: 2025-11-25 19:17:23.797856407 +0000 UTC m=+0.073141476 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:17:23 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[217290]: [NOTICE]   (217294) : haproxy version is 2.8.14-c23fe91
Nov 25 14:17:23 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[217290]: [NOTICE]   (217294) : path to executable is /usr/sbin/haproxy
Nov 25 14:17:23 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[217290]: [WARNING]  (217294) : Exiting Master process...
Nov 25 14:17:23 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[217290]: [ALERT]    (217294) : Current worker (217296) exited with code 143 (Terminated)
Nov 25 14:17:23 np0005535656 neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891[217290]: [WARNING]  (217294) : All workers exited. Exiting... (0)
Nov 25 14:17:23 np0005535656 systemd[1]: libpod-21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1.scope: Deactivated successfully.
Nov 25 14:17:23 np0005535656 podman[217649]: 2025-11-25 19:17:23.879670015 +0000 UTC m=+0.051445894 container died 21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 25 14:17:23 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1-userdata-shm.mount: Deactivated successfully.
Nov 25 14:17:23 np0005535656 systemd[1]: var-lib-containers-storage-overlay-f09ef4e80801c04a4e2e73d28b1fa6ca390ce4ebf86b5858e82d1d42ca430dd3-merged.mount: Deactivated successfully.
Nov 25 14:17:23 np0005535656 podman[217649]: 2025-11-25 19:17:23.917528832 +0000 UTC m=+0.089304691 container cleanup 21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.919 187223 DEBUG nova.compute.manager [req-05f64dbc-1c04-434f-a9ba-425412754f62 req-42274b31-19da-4e4a-ae31-ecdb6b80ca8a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-unplugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.920 187223 DEBUG oslo_concurrency.lockutils [req-05f64dbc-1c04-434f-a9ba-425412754f62 req-42274b31-19da-4e4a-ae31-ecdb6b80ca8a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.920 187223 DEBUG oslo_concurrency.lockutils [req-05f64dbc-1c04-434f-a9ba-425412754f62 req-42274b31-19da-4e4a-ae31-ecdb6b80ca8a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.920 187223 DEBUG oslo_concurrency.lockutils [req-05f64dbc-1c04-434f-a9ba-425412754f62 req-42274b31-19da-4e4a-ae31-ecdb6b80ca8a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.921 187223 DEBUG nova.compute.manager [req-05f64dbc-1c04-434f-a9ba-425412754f62 req-42274b31-19da-4e4a-ae31-ecdb6b80ca8a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] No waiting events found dispatching network-vif-unplugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.921 187223 DEBUG nova.compute.manager [req-05f64dbc-1c04-434f-a9ba-425412754f62 req-42274b31-19da-4e4a-ae31-ecdb6b80ca8a 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-unplugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:17:23 np0005535656 systemd[1]: libpod-conmon-21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1.scope: Deactivated successfully.
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.943 187223 DEBUG nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.943 187223 DEBUG nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 14:17:23 np0005535656 nova_compute[187219]: 2025-11-25 19:17:23.944 187223 DEBUG nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 14:17:23 np0005535656 podman[217694]: 2025-11-25 19:17:23.990564634 +0000 UTC m=+0.046431568 container remove 21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 14:17:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:23.995 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a59e51-6555-48cc-aab1-fc7946d4da82]: (4, ('Tue Nov 25 07:17:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1)\n21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1\nTue Nov 25 07:17:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 (21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1)\n21a87988134e708ed5200e6b4b36cfba832fbfd6cb3de6c840ed4b606edcffc1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:17:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:23.997 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed4e5ce-b412-4509-8598-959b5c84fe10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:17:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:23.998 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e881e87-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:17:24 np0005535656 nova_compute[187219]: 2025-11-25 19:17:24.000 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:24 np0005535656 kernel: tap8e881e87-b0: left promiscuous mode
Nov 25 14:17:24 np0005535656 nova_compute[187219]: 2025-11-25 19:17:24.054 187223 DEBUG nova.virt.libvirt.guest [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'bbf75eb3-0515-4610-a8c5-d8999a111b47' (instance-00000017) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 14:17:24 np0005535656 nova_compute[187219]: 2025-11-25 19:17:24.055 187223 INFO nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Migration operation has completed#033[00m
Nov 25 14:17:24 np0005535656 nova_compute[187219]: 2025-11-25 19:17:24.055 187223 INFO nova.compute.manager [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] _post_live_migration() is started..#033[00m
Nov 25 14:17:24 np0005535656 nova_compute[187219]: 2025-11-25 19:17:24.082 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:24.085 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[601cafe0-1597-4853-a493-4cda0c966281]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:17:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:24.110 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[32cd06b5-c889-4a8f-b862-2700fd573218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:17:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:24.112 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[60ba1621-0f54-46f1-9da5-e3abbbfe9b53]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:17:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:24.127 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[456442e4-0b1c-499a-b8e1-131bb2de1391]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513734, 'reachable_time': 30436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217714, 'error': None, 'target': 'ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:17:24 np0005535656 systemd[1]: run-netns-ovnmeta\x2d8e881e87\x2db103\x2d4ad8\x2d8de5\x2df8f4f0a10891.mount: Deactivated successfully.
Nov 25 14:17:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:24.132 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e881e87-b103-4ad8-8de5-f8f4f0a10891 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:17:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:24.132 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[8c51edf4-51c8-4f30-817e-9961e0da7ed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:17:24 np0005535656 nova_compute[187219]: 2025-11-25 19:17:24.468 187223 DEBUG nova.network.neutron [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Updated VIF entry in instance network info cache for port 6d0fcfbf-d5de-4b58-9223-ed19141e11fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:17:24 np0005535656 nova_compute[187219]: 2025-11-25 19:17:24.469 187223 DEBUG nova.network.neutron [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Updating instance_info_cache with network_info: [{"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:17:24 np0005535656 nova_compute[187219]: 2025-11-25 19:17:24.475 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:24 np0005535656 nova_compute[187219]: 2025-11-25 19:17:24.499 187223 DEBUG oslo_concurrency.lockutils [req-b541f258-4bb8-4c60-b48f-746ea0f35d0e req-ac8aa4f8-4136-4747-ad3a-fb0bc1528c90 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-bbf75eb3-0515-4610-a8c5-d8999a111b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:17:25 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:25.375 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.419 187223 DEBUG nova.network.neutron [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Activated binding for port 6d0fcfbf-d5de-4b58-9223-ed19141e11fb and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.420 187223 DEBUG nova.compute.manager [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.420 187223 DEBUG nova.virt.libvirt.vif [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:16:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1209546672',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1209546672',id=23,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:16:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ab3670f92d82410b981d159346c0c038',ramdisk_id='',reservation_id='r-xxg3mc0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteStrategies-2025590332',owner_user_name='tempest-TestExecuteStrategies-2025590332-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:17:11Z,user_data=None,user_id='e60aa8a36ef94fa186a5c8de1df9e594',uuid=bbf75eb3-0515-4610-a8c5-d8999a111b47,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.421 187223 DEBUG nova.network.os_vif_util [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "address": "fa:16:3e:ef:15:92", "network": {"id": "8e881e87-b103-4ad8-8de5-f8f4f0a10891", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-622246148-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ab3670f92d82410b981d159346c0c038", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0fcfbf-d5", "ovs_interfaceid": "6d0fcfbf-d5de-4b58-9223-ed19141e11fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.421 187223 DEBUG nova.network.os_vif_util [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ef:15:92,bridge_name='br-int',has_traffic_filtering=True,id=6d0fcfbf-d5de-4b58-9223-ed19141e11fb,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0fcfbf-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.421 187223 DEBUG os_vif [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:15:92,bridge_name='br-int',has_traffic_filtering=True,id=6d0fcfbf-d5de-4b58-9223-ed19141e11fb,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0fcfbf-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.423 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.423 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d0fcfbf-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.424 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.426 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.428 187223 INFO os_vif [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:15:92,bridge_name='br-int',has_traffic_filtering=True,id=6d0fcfbf-d5de-4b58-9223-ed19141e11fb,network=Network(8e881e87-b103-4ad8-8de5-f8f4f0a10891),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0fcfbf-d5')#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.429 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.429 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.429 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.430 187223 DEBUG nova.compute.manager [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.430 187223 INFO nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Deleting instance files /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47_del#033[00m
Nov 25 14:17:25 np0005535656 nova_compute[187219]: 2025-11-25 19:17:25.431 187223 INFO nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Deletion of /var/lib/nova/instances/bbf75eb3-0515-4610-a8c5-d8999a111b47_del complete#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.015 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.016 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.016 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.017 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.017 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] No waiting events found dispatching network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.018 187223 WARNING nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received unexpected event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.018 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.019 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.019 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.019 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.020 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] No waiting events found dispatching network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.020 187223 WARNING nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received unexpected event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.020 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-unplugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.021 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.021 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.021 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.022 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] No waiting events found dispatching network-vif-unplugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.022 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-unplugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.022 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.022 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.023 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.023 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.023 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] No waiting events found dispatching network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.023 187223 WARNING nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received unexpected event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.023 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.023 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.024 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.024 187223 DEBUG oslo_concurrency.lockutils [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.024 187223 DEBUG nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] No waiting events found dispatching network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:17:26 np0005535656 nova_compute[187219]: 2025-11-25 19:17:26.024 187223 WARNING nova.compute.manager [req-20d45946-4e63-43bb-89f1-fa5706f95567 req-8ea13da6-d0bb-4a20-a6b2-f3c96f2f86b2 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Received unexpected event network-vif-plugged-6d0fcfbf-d5de-4b58-9223-ed19141e11fb for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:17:27 np0005535656 nova_compute[187219]: 2025-11-25 19:17:27.979 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:28 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 14:17:28 np0005535656 systemd[217568]: Activating special unit Exit the Session...
Nov 25 14:17:28 np0005535656 systemd[217568]: Stopped target Main User Target.
Nov 25 14:17:28 np0005535656 systemd[217568]: Stopped target Basic System.
Nov 25 14:17:28 np0005535656 systemd[217568]: Stopped target Paths.
Nov 25 14:17:28 np0005535656 systemd[217568]: Stopped target Sockets.
Nov 25 14:17:28 np0005535656 systemd[217568]: Stopped target Timers.
Nov 25 14:17:28 np0005535656 systemd[217568]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:17:28 np0005535656 systemd[217568]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 14:17:28 np0005535656 systemd[217568]: Closed D-Bus User Message Bus Socket.
Nov 25 14:17:28 np0005535656 systemd[217568]: Stopped Create User's Volatile Files and Directories.
Nov 25 14:17:28 np0005535656 systemd[217568]: Removed slice User Application Slice.
Nov 25 14:17:28 np0005535656 systemd[217568]: Reached target Shutdown.
Nov 25 14:17:28 np0005535656 systemd[217568]: Finished Exit the Session.
Nov 25 14:17:28 np0005535656 systemd[217568]: Reached target Exit the Session.
Nov 25 14:17:28 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 14:17:28 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 14:17:28 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 14:17:28 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 14:17:28 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 14:17:28 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 14:17:28 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.426 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.750 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.750 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.750 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "bbf75eb3-0515-4610-a8c5-d8999a111b47-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.769 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.770 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.770 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.770 187223 DEBUG nova.compute.resource_tracker [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.916 187223 WARNING nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.918 187223 DEBUG nova.compute.resource_tracker [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5879MB free_disk=73.1628532409668GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.918 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.918 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:30 np0005535656 nova_compute[187219]: 2025-11-25 19:17:30.990 187223 DEBUG nova.compute.resource_tracker [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration for instance bbf75eb3-0515-4610-a8c5-d8999a111b47 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.015 187223 DEBUG nova.compute.resource_tracker [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.060 187223 DEBUG nova.compute.resource_tracker [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration 12e2895c-c42b-493d-91f0-2b76bf51b3c9 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.060 187223 DEBUG nova.compute.resource_tracker [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.060 187223 DEBUG nova.compute.resource_tracker [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.196 187223 DEBUG nova.compute.provider_tree [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.211 187223 DEBUG nova.scheduler.client.report [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.246 187223 DEBUG nova.compute.resource_tracker [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.247 187223 DEBUG oslo_concurrency.lockutils [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.251 187223 INFO nova.compute.manager [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.402 187223 INFO nova.scheduler.client.report [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Deleted allocation for migration 12e2895c-c42b-493d-91f0-2b76bf51b3c9#033[00m
Nov 25 14:17:31 np0005535656 nova_compute[187219]: 2025-11-25 19:17:31.403 187223 DEBUG nova.virt.libvirt.driver [None req-78c4d0e9-2037-4fa4-bed8-dab5fcbbe878 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 25 14:17:33 np0005535656 nova_compute[187219]: 2025-11-25 19:17:33.017 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:33 np0005535656 podman[217719]: 2025-11-25 19:17:33.974836803 +0000 UTC m=+0.085611161 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 25 14:17:34 np0005535656 podman[217718]: 2025-11-25 19:17:34.016611646 +0000 UTC m=+0.128793771 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:17:35 np0005535656 nova_compute[187219]: 2025-11-25 19:17:35.437 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:35 np0005535656 podman[197580]: time="2025-11-25T19:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:17:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:17:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 25 14:17:38 np0005535656 nova_compute[187219]: 2025-11-25 19:17:38.018 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:38 np0005535656 nova_compute[187219]: 2025-11-25 19:17:38.939 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764098243.937971, bbf75eb3-0515-4610-a8c5-d8999a111b47 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:17:38 np0005535656 nova_compute[187219]: 2025-11-25 19:17:38.939 187223 INFO nova.compute.manager [-] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:17:38 np0005535656 nova_compute[187219]: 2025-11-25 19:17:38.960 187223 DEBUG nova.compute.manager [None req-42ea7f5e-918b-43d7-9487-34b1e6ef10e1 - - - - - -] [instance: bbf75eb3-0515-4610-a8c5-d8999a111b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:17:39 np0005535656 podman[217761]: 2025-11-25 19:17:39.946540833 +0000 UTC m=+0.066651971 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 14:17:40 np0005535656 nova_compute[187219]: 2025-11-25 19:17:40.441 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:43 np0005535656 nova_compute[187219]: 2025-11-25 19:17:43.020 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:43 np0005535656 podman[217783]: 2025-11-25 19:17:43.934097619 +0000 UTC m=+0.061685547 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 25 14:17:45 np0005535656 nova_compute[187219]: 2025-11-25 19:17:45.444 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:48 np0005535656 nova_compute[187219]: 2025-11-25 19:17:48.022 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:17:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:17:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:17:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:17:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:17:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:17:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:17:50 np0005535656 nova_compute[187219]: 2025-11-25 19:17:50.486 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:53 np0005535656 nova_compute[187219]: 2025-11-25 19:17:53.074 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:53 np0005535656 podman[217804]: 2025-11-25 19:17:53.910506366 +0000 UTC m=+0.039862083 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:17:55 np0005535656 nova_compute[187219]: 2025-11-25 19:17:55.489 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:58 np0005535656 nova_compute[187219]: 2025-11-25 19:17:58.074 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:17:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:59.094 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:17:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:59.094 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:17:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:17:59.095 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:00 np0005535656 nova_compute[187219]: 2025-11-25 19:18:00.492 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:03 np0005535656 nova_compute[187219]: 2025-11-25 19:18:03.107 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:04 np0005535656 podman[217829]: 2025-11-25 19:18:04.958536991 +0000 UTC m=+0.085164919 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:18:04 np0005535656 podman[217830]: 2025-11-25 19:18:04.973166535 +0000 UTC m=+0.087532803 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 14:18:05 np0005535656 nova_compute[187219]: 2025-11-25 19:18:05.493 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:05 np0005535656 podman[197580]: time="2025-11-25T19:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:18:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:18:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 25 14:18:06 np0005535656 nova_compute[187219]: 2025-11-25 19:18:06.710 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:07 np0005535656 nova_compute[187219]: 2025-11-25 19:18:07.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:18:07 np0005535656 nova_compute[187219]: 2025-11-25 19:18:07.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:18:07 np0005535656 nova_compute[187219]: 2025-11-25 19:18:07.674 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:18:07 np0005535656 nova_compute[187219]: 2025-11-25 19:18:07.697 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:18:08 np0005535656 nova_compute[187219]: 2025-11-25 19:18:08.110 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:10 np0005535656 nova_compute[187219]: 2025-11-25 19:18:10.529 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:10 np0005535656 nova_compute[187219]: 2025-11-25 19:18:10.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:18:10 np0005535656 podman[217874]: 2025-11-25 19:18:10.966227759 +0000 UTC m=+0.087035919 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc.)
Nov 25 14:18:11 np0005535656 nova_compute[187219]: 2025-11-25 19:18:11.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:18:12 np0005535656 nova_compute[187219]: 2025-11-25 19:18:12.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:18:12 np0005535656 nova_compute[187219]: 2025-11-25 19:18:12.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:18:13 np0005535656 nova_compute[187219]: 2025-11-25 19:18:13.149 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:14 np0005535656 podman[217895]: 2025-11-25 19:18:14.933042246 +0000 UTC m=+0.056504828 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 25 14:18:15 np0005535656 nova_compute[187219]: 2025-11-25 19:18:15.532 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:15 np0005535656 nova_compute[187219]: 2025-11-25 19:18:15.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:18:16 np0005535656 nova_compute[187219]: 2025-11-25 19:18:16.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:18:16 np0005535656 nova_compute[187219]: 2025-11-25 19:18:16.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:18:16 np0005535656 nova_compute[187219]: 2025-11-25 19:18:16.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:18:17 np0005535656 nova_compute[187219]: 2025-11-25 19:18:17.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:18:17 np0005535656 nova_compute[187219]: 2025-11-25 19:18:17.807 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:17 np0005535656 nova_compute[187219]: 2025-11-25 19:18:17.808 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:17 np0005535656 nova_compute[187219]: 2025-11-25 19:18:17.808 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:17 np0005535656 nova_compute[187219]: 2025-11-25 19:18:17.809 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:18:17 np0005535656 nova_compute[187219]: 2025-11-25 19:18:17.981 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:18:17 np0005535656 nova_compute[187219]: 2025-11-25 19:18:17.982 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5877MB free_disk=73.16275024414062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:18:17 np0005535656 nova_compute[187219]: 2025-11-25 19:18:17.982 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:17 np0005535656 nova_compute[187219]: 2025-11-25 19:18:17.982 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.059 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.059 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.074 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing inventories for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.103 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating ProviderTree inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.103 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.125 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing aggregate associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.144 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing trait associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.151 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.177 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.200 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.202 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:18:18 np0005535656 nova_compute[187219]: 2025-11-25 19:18:18.203 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:18:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:18:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:18:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:18:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:18:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:18:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:18:20 np0005535656 nova_compute[187219]: 2025-11-25 19:18:20.575 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:23 np0005535656 nova_compute[187219]: 2025-11-25 19:18:23.189 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:23.493 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:18:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:23.493 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:18:23 np0005535656 nova_compute[187219]: 2025-11-25 19:18:23.494 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:23.494 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:18:24 np0005535656 podman[217915]: 2025-11-25 19:18:24.974987549 +0000 UTC m=+0.079353288 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:18:25 np0005535656 nova_compute[187219]: 2025-11-25 19:18:25.609 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:28 np0005535656 nova_compute[187219]: 2025-11-25 19:18:28.191 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:30 np0005535656 nova_compute[187219]: 2025-11-25 19:18:30.627 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:33 np0005535656 nova_compute[187219]: 2025-11-25 19:18:33.225 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:35 np0005535656 nova_compute[187219]: 2025-11-25 19:18:35.630 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:35 np0005535656 podman[197580]: time="2025-11-25T19:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:18:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:18:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 25 14:18:35 np0005535656 podman[217941]: 2025-11-25 19:18:35.975138846 +0000 UTC m=+0.073720647 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 25 14:18:36 np0005535656 podman[217940]: 2025-11-25 19:18:36.033874297 +0000 UTC m=+0.139095097 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:18:38 np0005535656 nova_compute[187219]: 2025-11-25 19:18:38.227 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:40 np0005535656 nova_compute[187219]: 2025-11-25 19:18:40.633 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:42 np0005535656 podman[217985]: 2025-11-25 19:18:42.01436003 +0000 UTC m=+0.121125513 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 14:18:43 np0005535656 nova_compute[187219]: 2025-11-25 19:18:43.281 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:45 np0005535656 ovn_controller[95460]: 2025-11-25T19:18:45Z|00174|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Nov 25 14:18:45 np0005535656 nova_compute[187219]: 2025-11-25 19:18:45.636 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:45 np0005535656 podman[218007]: 2025-11-25 19:18:45.952400141 +0000 UTC m=+0.063333027 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.415 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "ef106817-b316-4585-877b-b4c688fcc3a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.415 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.433 187223 DEBUG nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.518 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.519 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.527 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.528 187223 INFO nova.compute.claims [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.683 187223 DEBUG nova.compute.provider_tree [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.705 187223 DEBUG nova.scheduler.client.report [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.743 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.744 187223 DEBUG nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.823 187223 DEBUG nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.823 187223 DEBUG nova.network.neutron [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.853 187223 INFO nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.874 187223 DEBUG nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.972 187223 DEBUG nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.974 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.974 187223 INFO nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Creating image(s)#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.975 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "/var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.975 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "/var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.976 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "/var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:46 np0005535656 nova_compute[187219]: 2025-11-25 19:18:46.989 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.072 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.073 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.074 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.084 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.144 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.145 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.177 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.178 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.179 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.267 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.269 187223 DEBUG nova.virt.disk.api [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Checking if we can resize image /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.270 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.360 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.361 187223 DEBUG nova.virt.disk.api [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Cannot resize image /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.362 187223 DEBUG nova.objects.instance [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lazy-loading 'migration_context' on Instance uuid ef106817-b316-4585-877b-b4c688fcc3a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.377 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.378 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Ensure instance console log exists: /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.378 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.378 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.379 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:47 np0005535656 nova_compute[187219]: 2025-11-25 19:18:47.486 187223 DEBUG nova.policy [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'faa0e27f31f840699feb7befa5b86f95', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b3851241b16047ed9445aa3074f8dc4c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:18:48 np0005535656 nova_compute[187219]: 2025-11-25 19:18:48.283 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:48 np0005535656 nova_compute[187219]: 2025-11-25 19:18:48.477 187223 DEBUG nova.network.neutron [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Successfully created port: 978970e7-0207-4f7c-a2e0-78a09ef9c57f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:18:49 np0005535656 nova_compute[187219]: 2025-11-25 19:18:49.281 187223 DEBUG nova.network.neutron [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Successfully updated port: 978970e7-0207-4f7c-a2e0-78a09ef9c57f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:18:49 np0005535656 nova_compute[187219]: 2025-11-25 19:18:49.299 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:18:49 np0005535656 nova_compute[187219]: 2025-11-25 19:18:49.299 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquired lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:18:49 np0005535656 nova_compute[187219]: 2025-11-25 19:18:49.300 187223 DEBUG nova.network.neutron [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:18:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:18:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:18:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:18:49 np0005535656 nova_compute[187219]: 2025-11-25 19:18:49.439 187223 DEBUG nova.compute.manager [req-aae7f0dd-f8ca-4185-be35-58460f62099a req-6a147669-07d6-429d-8132-975d15d5df26 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Received event network-changed-978970e7-0207-4f7c-a2e0-78a09ef9c57f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:18:49 np0005535656 nova_compute[187219]: 2025-11-25 19:18:49.440 187223 DEBUG nova.compute.manager [req-aae7f0dd-f8ca-4185-be35-58460f62099a req-6a147669-07d6-429d-8132-975d15d5df26 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Refreshing instance network info cache due to event network-changed-978970e7-0207-4f7c-a2e0-78a09ef9c57f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:18:49 np0005535656 nova_compute[187219]: 2025-11-25 19:18:49.440 187223 DEBUG oslo_concurrency.lockutils [req-aae7f0dd-f8ca-4185-be35-58460f62099a req-6a147669-07d6-429d-8132-975d15d5df26 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:18:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:18:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:18:49 np0005535656 nova_compute[187219]: 2025-11-25 19:18:49.474 187223 DEBUG nova.network.neutron [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.456 187223 DEBUG nova.network.neutron [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Updating instance_info_cache with network_info: [{"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.481 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Releasing lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.482 187223 DEBUG nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Instance network_info: |[{"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.482 187223 DEBUG oslo_concurrency.lockutils [req-aae7f0dd-f8ca-4185-be35-58460f62099a req-6a147669-07d6-429d-8132-975d15d5df26 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.482 187223 DEBUG nova.network.neutron [req-aae7f0dd-f8ca-4185-be35-58460f62099a req-6a147669-07d6-429d-8132-975d15d5df26 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Refreshing network info cache for port 978970e7-0207-4f7c-a2e0-78a09ef9c57f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.484 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Start _get_guest_xml network_info=[{"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.488 187223 WARNING nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.492 187223 DEBUG nova.virt.libvirt.host [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.492 187223 DEBUG nova.virt.libvirt.host [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.495 187223 DEBUG nova.virt.libvirt.host [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.495 187223 DEBUG nova.virt.libvirt.host [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.496 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.497 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.497 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.497 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.497 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.498 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.498 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.498 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.498 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.499 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.499 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.499 187223 DEBUG nova.virt.hardware [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.504 187223 DEBUG nova.virt.libvirt.vif [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:18:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1188360108',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1188360108',id=25,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b3851241b16047ed9445aa3074f8dc4c',ramdisk_id='',reservation_id='r-ckgeuxby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:18:46Z,user_data=None,user_id='faa0e27f31f840699feb7befa5b86f95',uuid=ef106817-b316-4585-877b-b4c688fcc3a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.504 187223 DEBUG nova.network.os_vif_util [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Converting VIF {"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.505 187223 DEBUG nova.network.os_vif_util [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:d2:0f,bridge_name='br-int',has_traffic_filtering=True,id=978970e7-0207-4f7c-a2e0-78a09ef9c57f,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap978970e7-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.505 187223 DEBUG nova.objects.instance [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lazy-loading 'pci_devices' on Instance uuid ef106817-b316-4585-877b-b4c688fcc3a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.523 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <uuid>ef106817-b316-4585-877b-b4c688fcc3a0</uuid>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <name>instance-00000019</name>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1188360108</nova:name>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:18:50</nova:creationTime>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:        <nova:user uuid="faa0e27f31f840699feb7befa5b86f95">tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419-project-member</nova:user>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:        <nova:project uuid="b3851241b16047ed9445aa3074f8dc4c">tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419</nova:project>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:        <nova:port uuid="978970e7-0207-4f7c-a2e0-78a09ef9c57f">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <entry name="serial">ef106817-b316-4585-877b-b4c688fcc3a0</entry>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <entry name="uuid">ef106817-b316-4585-877b-b4c688fcc3a0</entry>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk.config"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:76:d2:0f"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <target dev="tap978970e7-02"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/console.log" append="off"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:18:50 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:18:50 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:18:50 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:18:50 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.524 187223 DEBUG nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Preparing to wait for external event network-vif-plugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.524 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.525 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.525 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.526 187223 DEBUG nova.virt.libvirt.vif [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:18:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1188360108',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1188360108',id=25,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b3851241b16047ed9445aa3074f8dc4c',ramdisk_id='',reservation_id='r-ckgeuxby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419',owner_user_name='tempest-TestExecu
teVmWorkloadBalanceStrategy-1931304419-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:18:46Z,user_data=None,user_id='faa0e27f31f840699feb7befa5b86f95',uuid=ef106817-b316-4585-877b-b4c688fcc3a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.526 187223 DEBUG nova.network.os_vif_util [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Converting VIF {"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.527 187223 DEBUG nova.network.os_vif_util [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:d2:0f,bridge_name='br-int',has_traffic_filtering=True,id=978970e7-0207-4f7c-a2e0-78a09ef9c57f,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap978970e7-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.527 187223 DEBUG os_vif [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:d2:0f,bridge_name='br-int',has_traffic_filtering=True,id=978970e7-0207-4f7c-a2e0-78a09ef9c57f,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap978970e7-02') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.528 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.528 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.529 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.531 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.531 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap978970e7-02, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.532 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap978970e7-02, col_values=(('external_ids', {'iface-id': '978970e7-0207-4f7c-a2e0-78a09ef9c57f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:d2:0f', 'vm-uuid': 'ef106817-b316-4585-877b-b4c688fcc3a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.596 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:50 np0005535656 NetworkManager[55548]: <info>  [1764098330.5975] manager: (tap978970e7-02): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.598 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.602 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.603 187223 INFO os_vif [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:d2:0f,bridge_name='br-int',has_traffic_filtering=True,id=978970e7-0207-4f7c-a2e0-78a09ef9c57f,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap978970e7-02')#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.651 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.652 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.652 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] No VIF found with MAC fa:16:3e:76:d2:0f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:18:50 np0005535656 nova_compute[187219]: 2025-11-25 19:18:50.653 187223 INFO nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Using config drive#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.053 187223 INFO nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Creating config drive at /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk.config#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.057 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzk3rvwsg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.194 187223 DEBUG oslo_concurrency.processutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzk3rvwsg" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:18:51 np0005535656 kernel: tap978970e7-02: entered promiscuous mode
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.293 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:51 np0005535656 NetworkManager[55548]: <info>  [1764098331.2926] manager: (tap978970e7-02): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Nov 25 14:18:51 np0005535656 ovn_controller[95460]: 2025-11-25T19:18:51Z|00175|binding|INFO|Claiming lport 978970e7-0207-4f7c-a2e0-78a09ef9c57f for this chassis.
Nov 25 14:18:51 np0005535656 ovn_controller[95460]: 2025-11-25T19:18:51Z|00176|binding|INFO|978970e7-0207-4f7c-a2e0-78a09ef9c57f: Claiming fa:16:3e:76:d2:0f 10.100.0.7
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.307 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.327 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:d2:0f 10.100.0.7'], port_security=['fa:16:3e:76:d2:0f 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ef106817-b316-4585-877b-b4c688fcc3a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54268334-dbc3-41de-8b55-7e2418c08455', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3851241b16047ed9445aa3074f8dc4c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f677cd87-1f26-4c83-89d6-94af98ee5763', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3375ebcc-f5c9-4616-a48e-8b0a6a2fcf3d, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=978970e7-0207-4f7c-a2e0-78a09ef9c57f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.329 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 978970e7-0207-4f7c-a2e0-78a09ef9c57f in datapath 54268334-dbc3-41de-8b55-7e2418c08455 bound to our chassis#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.331 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 54268334-dbc3-41de-8b55-7e2418c08455#033[00m
Nov 25 14:18:51 np0005535656 systemd-machined[153481]: New machine qemu-16-instance-00000019.
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.351 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[3bcf073f-6207-473b-a4c2-f0644dca7962]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.352 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap54268334-d1 in ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.355 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap54268334-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.355 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc603d3-1435-4687-9a3b-3bc2b952dd64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.356 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b69a312e-8e29-48fd-a40b-7222d01edbe0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.373 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[de738a50-7d45-4178-8172-06f7b82e77e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 systemd[1]: Started Virtual Machine qemu-16-instance-00000019.
Nov 25 14:18:51 np0005535656 ovn_controller[95460]: 2025-11-25T19:18:51Z|00177|binding|INFO|Setting lport 978970e7-0207-4f7c-a2e0-78a09ef9c57f ovn-installed in OVS
Nov 25 14:18:51 np0005535656 ovn_controller[95460]: 2025-11-25T19:18:51Z|00178|binding|INFO|Setting lport 978970e7-0207-4f7c-a2e0-78a09ef9c57f up in Southbound
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.391 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:51 np0005535656 systemd-udevd[218064]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.402 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7d0267-7581-4201-abd3-b6d62273e3f1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 NetworkManager[55548]: <info>  [1764098331.4171] device (tap978970e7-02): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:18:51 np0005535656 NetworkManager[55548]: <info>  [1764098331.4179] device (tap978970e7-02): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.438 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[d8815ef0-acd1-45db-a1e6-ca7ef48e806c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.446 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b28cc150-5064-43ef-8632-ace5ce9fe262]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 NetworkManager[55548]: <info>  [1764098331.4471] manager: (tap54268334-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.480 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[1baa042b-62e8-42f2-b6ee-e2f48831baff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.482 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[361c8cec-bcbd-48ae-9658-934c7d5ab306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 NetworkManager[55548]: <info>  [1764098331.5046] device (tap54268334-d0): carrier: link connected
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.510 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[17860cbd-8b99-4b95-8482-050110d3a8cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.533 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[bd6f3bbd-25b3-44a6-8e11-b1c113bc998b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54268334-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:bd:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529303, 'reachable_time': 41643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218095, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.551 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e10886b2-c682-4da3-b8a4-4a52c91a35f5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:bd6b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529303, 'tstamp': 529303}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218101, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.569 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[49178e73-a10c-4ebb-91e2-96ca6374cddd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54268334-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:bd:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529303, 'reachable_time': 41643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218102, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.607 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5a1e780c-88ad-4a07-9a01-adfe7276551f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.622 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098331.621813, ef106817-b316-4585-877b-b4c688fcc3a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.623 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] VM Started (Lifecycle Event)#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.664 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.669 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098331.6219394, ef106817-b316-4585-877b-b4c688fcc3a0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.669 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.680 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f3ad23ee-d56b-45a4-a41f-1236257fd6bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.682 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54268334-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.682 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.683 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54268334-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.699 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.704 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:18:51 np0005535656 NetworkManager[55548]: <info>  [1764098331.7200] manager: (tap54268334-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Nov 25 14:18:51 np0005535656 kernel: tap54268334-d0: entered promiscuous mode
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.719 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.722 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.723 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap54268334-d0, col_values=(('external_ids', {'iface-id': '69abfc8d-0dc0-43eb-8594-b550e923fb09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.725 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:51 np0005535656 ovn_controller[95460]: 2025-11-25T19:18:51Z|00179|binding|INFO|Releasing lport 69abfc8d-0dc0-43eb-8594-b550e923fb09 from this chassis (sb_readonly=0)
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.742 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.744 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/54268334-dbc3-41de-8b55-7e2418c08455.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/54268334-dbc3-41de-8b55-7e2418c08455.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.746 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[af2b77a7-7552-4561-9214-d60972858b2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.747 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-54268334-dbc3-41de-8b55-7e2418c08455
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/54268334-dbc3-41de-8b55-7e2418c08455.pid.haproxy
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 54268334-dbc3-41de-8b55-7e2418c08455
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 14:18:51 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:51.748 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'env', 'PROCESS_TAG=haproxy-54268334-dbc3-41de-8b55-7e2418c08455', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/54268334-dbc3-41de-8b55-7e2418c08455.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:18:51 np0005535656 nova_compute[187219]: 2025-11-25 19:18:51.749 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:18:52 np0005535656 podman[218135]: 2025-11-25 19:18:52.168652954 +0000 UTC m=+0.059722009 container create b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 25 14:18:52 np0005535656 systemd[1]: Started libpod-conmon-b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf.scope.
Nov 25 14:18:52 np0005535656 podman[218135]: 2025-11-25 19:18:52.144872663 +0000 UTC m=+0.035941708 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:18:52 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:18:52 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/140d571bb66a18462e767c4d8e03b7266d3dea1102cc65d247302ce4d903e3ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:18:52 np0005535656 podman[218135]: 2025-11-25 19:18:52.272903551 +0000 UTC m=+0.163972666 container init b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:18:52 np0005535656 podman[218135]: 2025-11-25 19:18:52.278244724 +0000 UTC m=+0.169313789 container start b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 25 14:18:52 np0005535656 neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455[218150]: [NOTICE]   (218154) : New worker (218156) forked
Nov 25 14:18:52 np0005535656 neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455[218150]: [NOTICE]   (218154) : Loading success.
Nov 25 14:18:53 np0005535656 nova_compute[187219]: 2025-11-25 19:18:53.286 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:53 np0005535656 nova_compute[187219]: 2025-11-25 19:18:53.539 187223 DEBUG nova.network.neutron [req-aae7f0dd-f8ca-4185-be35-58460f62099a req-6a147669-07d6-429d-8132-975d15d5df26 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Updated VIF entry in instance network info cache for port 978970e7-0207-4f7c-a2e0-78a09ef9c57f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:18:53 np0005535656 nova_compute[187219]: 2025-11-25 19:18:53.540 187223 DEBUG nova.network.neutron [req-aae7f0dd-f8ca-4185-be35-58460f62099a req-6a147669-07d6-429d-8132-975d15d5df26 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Updating instance_info_cache with network_info: [{"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:18:53 np0005535656 nova_compute[187219]: 2025-11-25 19:18:53.568 187223 DEBUG oslo_concurrency.lockutils [req-aae7f0dd-f8ca-4185-be35-58460f62099a req-6a147669-07d6-429d-8132-975d15d5df26 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.567 187223 DEBUG nova.compute.manager [req-94d8bb7e-5648-4988-ad52-3663fdeef391 req-7b41494d-5bba-41c6-b434-a4f43ac68325 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Received event network-vif-plugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.568 187223 DEBUG oslo_concurrency.lockutils [req-94d8bb7e-5648-4988-ad52-3663fdeef391 req-7b41494d-5bba-41c6-b434-a4f43ac68325 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.569 187223 DEBUG oslo_concurrency.lockutils [req-94d8bb7e-5648-4988-ad52-3663fdeef391 req-7b41494d-5bba-41c6-b434-a4f43ac68325 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.569 187223 DEBUG oslo_concurrency.lockutils [req-94d8bb7e-5648-4988-ad52-3663fdeef391 req-7b41494d-5bba-41c6-b434-a4f43ac68325 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.570 187223 DEBUG nova.compute.manager [req-94d8bb7e-5648-4988-ad52-3663fdeef391 req-7b41494d-5bba-41c6-b434-a4f43ac68325 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Processing event network-vif-plugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.571 187223 DEBUG nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.578 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098334.5782073, ef106817-b316-4585-877b-b4c688fcc3a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.579 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.585 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.591 187223 INFO nova.virt.libvirt.driver [-] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Instance spawned successfully.#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.591 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.612 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.619 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.623 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.624 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.624 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.625 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.626 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.626 187223 DEBUG nova.virt.libvirt.driver [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.650 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.675 187223 INFO nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Took 7.70 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.676 187223 DEBUG nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.747 187223 INFO nova.compute.manager [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Took 8.27 seconds to build instance.#033[00m
Nov 25 14:18:54 np0005535656 nova_compute[187219]: 2025-11-25 19:18:54.762 187223 DEBUG oslo_concurrency.lockutils [None req-c763e488-080d-402a-b7d0-75b0b88e0b91 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:55 np0005535656 nova_compute[187219]: 2025-11-25 19:18:55.646 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:56 np0005535656 podman[218165]: 2025-11-25 19:18:56.000599118 +0000 UTC m=+0.106420046 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:18:56 np0005535656 nova_compute[187219]: 2025-11-25 19:18:56.767 187223 DEBUG nova.compute.manager [req-1a73592c-73d7-460b-a771-62298d1e3887 req-b0fd8fb4-36db-43f7-8150-43a247035790 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Received event network-vif-plugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:18:56 np0005535656 nova_compute[187219]: 2025-11-25 19:18:56.767 187223 DEBUG oslo_concurrency.lockutils [req-1a73592c-73d7-460b-a771-62298d1e3887 req-b0fd8fb4-36db-43f7-8150-43a247035790 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:56 np0005535656 nova_compute[187219]: 2025-11-25 19:18:56.768 187223 DEBUG oslo_concurrency.lockutils [req-1a73592c-73d7-460b-a771-62298d1e3887 req-b0fd8fb4-36db-43f7-8150-43a247035790 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:56 np0005535656 nova_compute[187219]: 2025-11-25 19:18:56.768 187223 DEBUG oslo_concurrency.lockutils [req-1a73592c-73d7-460b-a771-62298d1e3887 req-b0fd8fb4-36db-43f7-8150-43a247035790 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:18:56 np0005535656 nova_compute[187219]: 2025-11-25 19:18:56.768 187223 DEBUG nova.compute.manager [req-1a73592c-73d7-460b-a771-62298d1e3887 req-b0fd8fb4-36db-43f7-8150-43a247035790 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] No waiting events found dispatching network-vif-plugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:18:56 np0005535656 nova_compute[187219]: 2025-11-25 19:18:56.768 187223 WARNING nova.compute.manager [req-1a73592c-73d7-460b-a771-62298d1e3887 req-b0fd8fb4-36db-43f7-8150-43a247035790 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Received unexpected event network-vif-plugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f for instance with vm_state active and task_state None.#033[00m
Nov 25 14:18:58 np0005535656 nova_compute[187219]: 2025-11-25 19:18:58.288 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:18:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:59.096 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:18:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:59.097 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:18:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:18:59.097 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:19:00 np0005535656 nova_compute[187219]: 2025-11-25 19:19:00.650 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:03 np0005535656 nova_compute[187219]: 2025-11-25 19:19:03.343 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:05 np0005535656 podman[197580]: time="2025-11-25T19:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:19:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:19:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3067 "" "Go-http-client/1.1"
Nov 25 14:19:05 np0005535656 nova_compute[187219]: 2025-11-25 19:19:05.693 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:06 np0005535656 podman[218206]: 2025-11-25 19:19:06.951219161 +0000 UTC m=+0.061175619 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 14:19:07 np0005535656 podman[218205]: 2025-11-25 19:19:07.070892833 +0000 UTC m=+0.176194665 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 25 14:19:08 np0005535656 nova_compute[187219]: 2025-11-25 19:19:08.345 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:08 np0005535656 ovn_controller[95460]: 2025-11-25T19:19:08Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:76:d2:0f 10.100.0.7
Nov 25 14:19:08 np0005535656 ovn_controller[95460]: 2025-11-25T19:19:08Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:76:d2:0f 10.100.0.7
Nov 25 14:19:09 np0005535656 nova_compute[187219]: 2025-11-25 19:19:09.204 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:09 np0005535656 nova_compute[187219]: 2025-11-25 19:19:09.205 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:19:09 np0005535656 nova_compute[187219]: 2025-11-25 19:19:09.205 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:19:09 np0005535656 nova_compute[187219]: 2025-11-25 19:19:09.372 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:19:09 np0005535656 nova_compute[187219]: 2025-11-25 19:19:09.374 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:19:09 np0005535656 nova_compute[187219]: 2025-11-25 19:19:09.374 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 14:19:09 np0005535656 nova_compute[187219]: 2025-11-25 19:19:09.375 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid ef106817-b316-4585-877b-b4c688fcc3a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:19:10 np0005535656 nova_compute[187219]: 2025-11-25 19:19:10.736 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:10 np0005535656 nova_compute[187219]: 2025-11-25 19:19:10.853 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Updating instance_info_cache with network_info: [{"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:19:10 np0005535656 nova_compute[187219]: 2025-11-25 19:19:10.878 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:19:10 np0005535656 nova_compute[187219]: 2025-11-25 19:19:10.878 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 14:19:11 np0005535656 nova_compute[187219]: 2025-11-25 19:19:11.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:12 np0005535656 nova_compute[187219]: 2025-11-25 19:19:12.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:12 np0005535656 podman[218251]: 2025-11-25 19:19:12.990540237 +0000 UTC m=+0.097964199 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vcs-type=git, config_id=edpm, io.openshift.expose-services=)
Nov 25 14:19:13 np0005535656 nova_compute[187219]: 2025-11-25 19:19:13.383 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:13 np0005535656 nova_compute[187219]: 2025-11-25 19:19:13.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:13 np0005535656 nova_compute[187219]: 2025-11-25 19:19:13.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:15 np0005535656 nova_compute[187219]: 2025-11-25 19:19:15.770 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:16 np0005535656 nova_compute[187219]: 2025-11-25 19:19:16.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:16 np0005535656 nova_compute[187219]: 2025-11-25 19:19:16.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:19:17 np0005535656 podman[218272]: 2025-11-25 19:19:17.258200803 +0000 UTC m=+0.062185975 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 14:19:17 np0005535656 nova_compute[187219]: 2025-11-25 19:19:17.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:17 np0005535656 nova_compute[187219]: 2025-11-25 19:19:17.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:18 np0005535656 nova_compute[187219]: 2025-11-25 19:19:18.386 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:19:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:19:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:19:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:19:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.693 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.720 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.720 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.720 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.720 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.807 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.880 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.882 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:19:19 np0005535656 nova_compute[187219]: 2025-11-25 19:19:19.968 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.170 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.172 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5711MB free_disk=73.13328170776367GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.172 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.172 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.249 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance ef106817-b316-4585-877b-b4c688fcc3a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.249 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.249 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.303 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.320 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.347 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.347 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:19:20 np0005535656 nova_compute[187219]: 2025-11-25 19:19:20.821 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:23 np0005535656 nova_compute[187219]: 2025-11-25 19:19:23.414 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:25 np0005535656 nova_compute[187219]: 2025-11-25 19:19:25.872 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:26 np0005535656 podman[218301]: 2025-11-25 19:19:26.948475741 +0000 UTC m=+0.066898741 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:19:28 np0005535656 nova_compute[187219]: 2025-11-25 19:19:28.417 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:30 np0005535656 nova_compute[187219]: 2025-11-25 19:19:30.900 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:33 np0005535656 nova_compute[187219]: 2025-11-25 19:19:33.452 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:35 np0005535656 podman[197580]: time="2025-11-25T19:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:19:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:19:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3062 "" "Go-http-client/1.1"
Nov 25 14:19:35 np0005535656 nova_compute[187219]: 2025-11-25 19:19:35.939 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:36 np0005535656 ovn_controller[95460]: 2025-11-25T19:19:36Z|00180|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 14:19:37 np0005535656 podman[218325]: 2025-11-25 19:19:37.951978087 +0000 UTC m=+0.067334564 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 25 14:19:37 np0005535656 podman[218324]: 2025-11-25 19:19:37.9859157 +0000 UTC m=+0.114197885 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:19:38 np0005535656 nova_compute[187219]: 2025-11-25 19:19:38.472 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:40 np0005535656 nova_compute[187219]: 2025-11-25 19:19:40.944 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:43 np0005535656 nova_compute[187219]: 2025-11-25 19:19:43.516 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:43 np0005535656 podman[218369]: 2025-11-25 19:19:43.978882151 +0000 UTC m=+0.094770033 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350)
Nov 25 14:19:45 np0005535656 nova_compute[187219]: 2025-11-25 19:19:45.986 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:47 np0005535656 podman[218391]: 2025-11-25 19:19:47.980191684 +0000 UTC m=+0.089978414 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 25 14:19:48 np0005535656 nova_compute[187219]: 2025-11-25 19:19:48.555 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:19:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:19:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:19:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:19:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:19:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:19:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:19:50 np0005535656 nova_compute[187219]: 2025-11-25 19:19:50.989 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:53 np0005535656 nova_compute[187219]: 2025-11-25 19:19:53.596 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:55 np0005535656 nova_compute[187219]: 2025-11-25 19:19:55.994 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:57 np0005535656 podman[218411]: 2025-11-25 19:19:57.965956559 +0000 UTC m=+0.079335866 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:19:58 np0005535656 nova_compute[187219]: 2025-11-25 19:19:58.615 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:19:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:19:59.097 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:19:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:19:59.098 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:19:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:19:59.099 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:00 np0005535656 nova_compute[187219]: 2025-11-25 19:20:00.998 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:03 np0005535656 nova_compute[187219]: 2025-11-25 19:20:03.654 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:05 np0005535656 podman[197580]: time="2025-11-25T19:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:20:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:20:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Nov 25 14:20:06 np0005535656 nova_compute[187219]: 2025-11-25 19:20:06.003 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:08 np0005535656 nova_compute[187219]: 2025-11-25 19:20:08.694 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:08 np0005535656 podman[218452]: 2025-11-25 19:20:08.956402444 +0000 UTC m=+0.063371877 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 14:20:09 np0005535656 podman[218451]: 2025-11-25 19:20:09.004257422 +0000 UTC m=+0.111334128 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 14:20:10 np0005535656 nova_compute[187219]: 2025-11-25 19:20:10.326 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:10 np0005535656 nova_compute[187219]: 2025-11-25 19:20:10.327 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:20:10 np0005535656 nova_compute[187219]: 2025-11-25 19:20:10.327 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:20:10 np0005535656 nova_compute[187219]: 2025-11-25 19:20:10.481 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:20:10 np0005535656 nova_compute[187219]: 2025-11-25 19:20:10.481 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:20:10 np0005535656 nova_compute[187219]: 2025-11-25 19:20:10.482 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 14:20:10 np0005535656 nova_compute[187219]: 2025-11-25 19:20:10.482 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid ef106817-b316-4585-877b-b4c688fcc3a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:20:11 np0005535656 nova_compute[187219]: 2025-11-25 19:20:11.007 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:13 np0005535656 nova_compute[187219]: 2025-11-25 19:20:13.419 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Updating instance_info_cache with network_info: [{"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:20:13 np0005535656 nova_compute[187219]: 2025-11-25 19:20:13.441 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-ef106817-b316-4585-877b-b4c688fcc3a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:20:13 np0005535656 nova_compute[187219]: 2025-11-25 19:20:13.442 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 14:20:13 np0005535656 nova_compute[187219]: 2025-11-25 19:20:13.442 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:13 np0005535656 nova_compute[187219]: 2025-11-25 19:20:13.442 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:13 np0005535656 nova_compute[187219]: 2025-11-25 19:20:13.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:13 np0005535656 nova_compute[187219]: 2025-11-25 19:20:13.710 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:14 np0005535656 nova_compute[187219]: 2025-11-25 19:20:14.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:14 np0005535656 podman[218491]: 2025-11-25 19:20:14.974627201 +0000 UTC m=+0.087569888 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Nov 25 14:20:15 np0005535656 nova_compute[187219]: 2025-11-25 19:20:15.542 187223 DEBUG nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Creating tmpfile /var/lib/nova/instances/tmpx7udtkmw to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 25 14:20:15 np0005535656 nova_compute[187219]: 2025-11-25 19:20:15.677 187223 DEBUG nova.compute.manager [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx7udtkmw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 25 14:20:16 np0005535656 nova_compute[187219]: 2025-11-25 19:20:16.011 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:16 np0005535656 nova_compute[187219]: 2025-11-25 19:20:16.453 187223 DEBUG nova.compute.manager [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx7udtkmw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 25 14:20:16 np0005535656 nova_compute[187219]: 2025-11-25 19:20:16.480 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:20:16 np0005535656 nova_compute[187219]: 2025-11-25 19:20:16.480 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:20:16 np0005535656 nova_compute[187219]: 2025-11-25 19:20:16.481 187223 DEBUG nova.network.neutron [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:20:17 np0005535656 nova_compute[187219]: 2025-11-25 19:20:17.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:17 np0005535656 nova_compute[187219]: 2025-11-25 19:20:17.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:17 np0005535656 nova_compute[187219]: 2025-11-25 19:20:17.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 25 14:20:18 np0005535656 nova_compute[187219]: 2025-11-25 19:20:18.683 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:18 np0005535656 nova_compute[187219]: 2025-11-25 19:20:18.683 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:20:18 np0005535656 nova_compute[187219]: 2025-11-25 19:20:18.747 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:18 np0005535656 podman[218512]: 2025-11-25 19:20:18.983481358 +0000 UTC m=+0.094344721 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 14:20:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:20:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:20:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:20:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:20:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:20:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:20:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.494 187223 DEBUG nova.network.neutron [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Updating instance_info_cache with network_info: [{"id": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "address": "fa:16:3e:9f:d1:68", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eff8ff5-0e", "ovs_interfaceid": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.514 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.516 187223 DEBUG nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx7udtkmw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.516 187223 DEBUG nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Creating instance directory: /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.516 187223 DEBUG nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Creating disk.info with the contents: {'/var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk': 'qcow2', '/var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.517 187223 DEBUG nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.517 187223 DEBUG nova.objects.instance [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.546 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.633 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.634 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.635 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.649 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.715 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.717 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.753 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.755 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.756 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.852 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.853 187223 DEBUG nova.virt.disk.api [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Checking if we can resize image /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.854 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.942 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.945 187223 DEBUG nova.virt.disk.api [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Cannot resize image /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.945 187223 DEBUG nova.objects.instance [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.965 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.989 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk.config 485376" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.991 187223 DEBUG nova.virt.libvirt.volume.remotefs [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk.config to /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 25 14:20:19 np0005535656 nova_compute[187219]: 2025-11-25 19:20:19.991 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk.config /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.496 187223 DEBUG oslo_concurrency.processutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3/disk.config /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.498 187223 DEBUG nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.500 187223 DEBUG nova.virt.libvirt.vif [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:19:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1229429361',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1229429361',id=26,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:19:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b3851241b16047ed9445aa3074f8dc4c',ramdisk_id='',reservation_id='r-uyfbxu9e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:19:11Z,user_data=None,user_id='faa0e27f31f840699feb7befa5b86f95',uuid=82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "address": "fa:16:3e:9f:d1:68", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6eff8ff5-0e", "ovs_interfaceid": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.500 187223 DEBUG nova.network.os_vif_util [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "address": "fa:16:3e:9f:d1:68", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6eff8ff5-0e", "ovs_interfaceid": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.502 187223 DEBUG nova.network.os_vif_util [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eff8ff5-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.503 187223 DEBUG os_vif [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eff8ff5-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.504 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.505 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.506 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.510 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.510 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6eff8ff5-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.511 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6eff8ff5-0e, col_values=(('external_ids', {'iface-id': '6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:d1:68', 'vm-uuid': '82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:20 np0005535656 NetworkManager[55548]: <info>  [1764098420.5146] manager: (tap6eff8ff5-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.517 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.527 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.529 187223 INFO os_vif [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eff8ff5-0e')#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.529 187223 DEBUG nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.530 187223 DEBUG nova.compute.manager [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx7udtkmw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.699 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.700 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.700 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.700 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.764 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.834 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.835 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:20:20 np0005535656 nova_compute[187219]: 2025-11-25 19:20:20.929 187223 DEBUG oslo_concurrency.processutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.133 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.134 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5702MB free_disk=73.13284301757812GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.135 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.135 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.221 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Migration for instance 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.259 187223 INFO nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Updating resource usage from migration ab6abba1-b80a-4850-86f9-eb3db4eea201#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.260 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Starting to track incoming migration ab6abba1-b80a-4850-86f9-eb3db4eea201 with flavor a7ebe884-489b-45b6-89a1-4967aa291cd6 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.320 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance ef106817-b316-4585-877b-b4c688fcc3a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.360 187223 WARNING nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.361 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.361 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.486 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.501 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.518 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.518 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:21 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:21.647 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:20:21 np0005535656 nova_compute[187219]: 2025-11-25 19:20:21.648 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:21 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:21.649 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:20:22 np0005535656 nova_compute[187219]: 2025-11-25 19:20:22.614 187223 DEBUG nova.network.neutron [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Port 6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 25 14:20:22 np0005535656 nova_compute[187219]: 2025-11-25 19:20:22.615 187223 DEBUG nova.compute.manager [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx7udtkmw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 25 14:20:22 np0005535656 systemd[1]: Starting libvirt proxy daemon...
Nov 25 14:20:22 np0005535656 systemd[1]: Started libvirt proxy daemon.
Nov 25 14:20:22 np0005535656 kernel: tap6eff8ff5-0e: entered promiscuous mode
Nov 25 14:20:22 np0005535656 NetworkManager[55548]: <info>  [1764098422.9007] manager: (tap6eff8ff5-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Nov 25 14:20:22 np0005535656 nova_compute[187219]: 2025-11-25 19:20:22.902 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:22 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:22Z|00181|binding|INFO|Claiming lport 6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c for this additional chassis.
Nov 25 14:20:22 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:22Z|00182|binding|INFO|6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c: Claiming fa:16:3e:9f:d1:68 10.100.0.5
Nov 25 14:20:22 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:22Z|00183|binding|INFO|Setting lport 6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c ovn-installed in OVS
Nov 25 14:20:22 np0005535656 nova_compute[187219]: 2025-11-25 19:20:22.929 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:22 np0005535656 nova_compute[187219]: 2025-11-25 19:20:22.933 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:22 np0005535656 systemd-udevd[218593]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:20:22 np0005535656 NetworkManager[55548]: <info>  [1764098422.9638] device (tap6eff8ff5-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:20:22 np0005535656 NetworkManager[55548]: <info>  [1764098422.9652] device (tap6eff8ff5-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:20:22 np0005535656 systemd-machined[153481]: New machine qemu-17-instance-0000001a.
Nov 25 14:20:22 np0005535656 systemd[1]: Started Virtual Machine qemu-17-instance-0000001a.
Nov 25 14:20:23 np0005535656 nova_compute[187219]: 2025-11-25 19:20:23.616 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098423.6156428, 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:20:23 np0005535656 nova_compute[187219]: 2025-11-25 19:20:23.617 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] VM Started (Lifecycle Event)#033[00m
Nov 25 14:20:23 np0005535656 nova_compute[187219]: 2025-11-25 19:20:23.652 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:20:23 np0005535656 nova_compute[187219]: 2025-11-25 19:20:23.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:23 np0005535656 nova_compute[187219]: 2025-11-25 19:20:23.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 25 14:20:23 np0005535656 nova_compute[187219]: 2025-11-25 19:20:23.695 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 25 14:20:23 np0005535656 nova_compute[187219]: 2025-11-25 19:20:23.788 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:24 np0005535656 nova_compute[187219]: 2025-11-25 19:20:24.363 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098424.3628023, 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:20:24 np0005535656 nova_compute[187219]: 2025-11-25 19:20:24.363 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:20:24 np0005535656 nova_compute[187219]: 2025-11-25 19:20:24.397 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:20:24 np0005535656 nova_compute[187219]: 2025-11-25 19:20:24.399 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:20:24 np0005535656 nova_compute[187219]: 2025-11-25 19:20:24.420 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 25 14:20:25 np0005535656 nova_compute[187219]: 2025-11-25 19:20:25.516 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:26 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:26Z|00184|binding|INFO|Claiming lport 6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c for this chassis.
Nov 25 14:20:26 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:26Z|00185|binding|INFO|6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c: Claiming fa:16:3e:9f:d1:68 10.100.0.5
Nov 25 14:20:26 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:26Z|00186|binding|INFO|Setting lport 6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c up in Southbound
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.596 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:d1:68 10.100.0.5'], port_security=['fa:16:3e:9f:d1:68 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54268334-dbc3-41de-8b55-7e2418c08455', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3851241b16047ed9445aa3074f8dc4c', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'f677cd87-1f26-4c83-89d6-94af98ee5763', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3375ebcc-f5c9-4616-a48e-8b0a6a2fcf3d, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.599 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c in datapath 54268334-dbc3-41de-8b55-7e2418c08455 bound to our chassis#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.602 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 54268334-dbc3-41de-8b55-7e2418c08455#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.624 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e0051a-e9ab-4b63-a0f4-8f482670d2ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.678 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[4c326986-cd52-4ec2-bb4d-26397a93d29e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.682 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[250d99bf-866d-4267-9ad7-46f117472470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.734 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[eaccf004-14eb-4a03-900b-2363491bf5ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.761 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[9e6134c0-4220-49ea-903b-818192d8a5c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54268334-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:bd:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529303, 'reachable_time': 44103, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218629, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.785 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[dddc1ff2-3aa1-48d3-b656-91b621b19b95]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap54268334-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529315, 'tstamp': 529315}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218630, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap54268334-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529320, 'tstamp': 529320}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218630, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.788 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54268334-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:26 np0005535656 nova_compute[187219]: 2025-11-25 19:20:26.790 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:26 np0005535656 nova_compute[187219]: 2025-11-25 19:20:26.792 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.792 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54268334-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.793 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.793 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap54268334-d0, col_values=(('external_ids', {'iface-id': '69abfc8d-0dc0-43eb-8594-b550e923fb09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:26 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:26.794 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:20:26 np0005535656 nova_compute[187219]: 2025-11-25 19:20:26.804 187223 INFO nova.compute.manager [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Post operation of migration started#033[00m
Nov 25 14:20:27 np0005535656 nova_compute[187219]: 2025-11-25 19:20:27.642 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:20:27 np0005535656 nova_compute[187219]: 2025-11-25 19:20:27.643 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:20:27 np0005535656 nova_compute[187219]: 2025-11-25 19:20:27.643 187223 DEBUG nova.network.neutron [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:20:28 np0005535656 nova_compute[187219]: 2025-11-25 19:20:28.824 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:28 np0005535656 podman[218631]: 2025-11-25 19:20:28.981506324 +0000 UTC m=+0.095038420 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:20:29 np0005535656 nova_compute[187219]: 2025-11-25 19:20:29.143 187223 DEBUG nova.network.neutron [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Updating instance_info_cache with network_info: [{"id": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "address": "fa:16:3e:9f:d1:68", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eff8ff5-0e", "ovs_interfaceid": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:20:29 np0005535656 nova_compute[187219]: 2025-11-25 19:20:29.181 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:20:29 np0005535656 nova_compute[187219]: 2025-11-25 19:20:29.196 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:29 np0005535656 nova_compute[187219]: 2025-11-25 19:20:29.197 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:29 np0005535656 nova_compute[187219]: 2025-11-25 19:20:29.197 187223 DEBUG oslo_concurrency.lockutils [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:29 np0005535656 nova_compute[187219]: 2025-11-25 19:20:29.202 187223 INFO nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 25 14:20:29 np0005535656 virtqemud[186765]: Domain id=17 name='instance-0000001a' uuid=82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 is tainted: custom-monitor
Nov 25 14:20:29 np0005535656 nova_compute[187219]: 2025-11-25 19:20:29.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:20:30 np0005535656 nova_compute[187219]: 2025-11-25 19:20:30.211 187223 INFO nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 25 14:20:30 np0005535656 nova_compute[187219]: 2025-11-25 19:20:30.521 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:30 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:30.652 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:31 np0005535656 nova_compute[187219]: 2025-11-25 19:20:31.218 187223 INFO nova.virt.libvirt.driver [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 25 14:20:31 np0005535656 nova_compute[187219]: 2025-11-25 19:20:31.224 187223 DEBUG nova.compute.manager [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:20:31 np0005535656 nova_compute[187219]: 2025-11-25 19:20:31.244 187223 DEBUG nova.objects.instance [None req-58e98949-7f07-49fb-b513-9c34b913625a fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 25 14:20:33 np0005535656 nova_compute[187219]: 2025-11-25 19:20:33.868 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.523 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.605 187223 DEBUG oslo_concurrency.lockutils [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.606 187223 DEBUG oslo_concurrency.lockutils [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.607 187223 DEBUG oslo_concurrency.lockutils [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.607 187223 DEBUG oslo_concurrency.lockutils [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.608 187223 DEBUG oslo_concurrency.lockutils [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.610 187223 INFO nova.compute.manager [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Terminating instance#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.611 187223 DEBUG nova.compute.manager [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 25 14:20:35 np0005535656 kernel: tap6eff8ff5-0e (unregistering): left promiscuous mode
Nov 25 14:20:35 np0005535656 NetworkManager[55548]: <info>  [1764098435.6391] device (tap6eff8ff5-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:20:35 np0005535656 podman[197580]: time="2025-11-25T19:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:20:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:20:35 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:35Z|00187|binding|INFO|Releasing lport 6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c from this chassis (sb_readonly=0)
Nov 25 14:20:35 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:35Z|00188|binding|INFO|Setting lport 6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c down in Southbound
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.653 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:35Z|00189|binding|INFO|Removing iface tap6eff8ff5-0e ovn-installed in OVS
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.661 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3068 "" "Go-http-client/1.1"
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.683 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.682 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:d1:68 10.100.0.5'], port_security=['fa:16:3e:9f:d1:68 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54268334-dbc3-41de-8b55-7e2418c08455', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3851241b16047ed9445aa3074f8dc4c', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'f677cd87-1f26-4c83-89d6-94af98ee5763', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3375ebcc-f5c9-4616-a48e-8b0a6a2fcf3d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.685 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c in datapath 54268334-dbc3-41de-8b55-7e2418c08455 unbound from our chassis#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.688 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 54268334-dbc3-41de-8b55-7e2418c08455#033[00m
Nov 25 14:20:35 np0005535656 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Nov 25 14:20:35 np0005535656 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001a.scope: Consumed 1.755s CPU time.
Nov 25 14:20:35 np0005535656 systemd-machined[153481]: Machine qemu-17-instance-0000001a terminated.
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.722 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[9af68aab-9dab-4698-8d95-1ad9d5ffe245]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.763 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[9621ebe5-7d04-4f74-8713-2650ece34436]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.767 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc48909-73f2-4803-9188-0c2050c610ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.812 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[01312f11-214b-4ae4-9081-11e7af2279cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.842 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.845 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[42fec63c-342a-4b72-88cb-afc528cb7ffe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap54268334-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:bd:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529303, 'reachable_time': 44103, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218667, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.852 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.877 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3ce711-7c66-4c46-bc18-ceb8a6343f39]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap54268334-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529315, 'tstamp': 529315}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218675, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap54268334-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529320, 'tstamp': 529320}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218675, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.879 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54268334-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.881 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.888 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.889 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54268334-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.889 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.890 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap54268334-d0, col_values=(('external_ids', {'iface-id': '69abfc8d-0dc0-43eb-8594-b550e923fb09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:35 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:35.890 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.905 187223 INFO nova.virt.libvirt.driver [-] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Instance destroyed successfully.#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.906 187223 DEBUG nova.objects.instance [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lazy-loading 'resources' on Instance uuid 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.930 187223 DEBUG nova.virt.libvirt.vif [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-25T19:19:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1229429361',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1229429361',id=26,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:19:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b3851241b16047ed9445aa3074f8dc4c',ramdisk_id='',reservation_id='r-uyfbxu9e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:20:31Z,user_data=None,user_id='faa0e27f31f840699feb7befa5b86f95',uuid=82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "address": "fa:16:3e:9f:d1:68", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eff8ff5-0e", "ovs_interfaceid": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.930 187223 DEBUG nova.network.os_vif_util [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Converting VIF {"id": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "address": "fa:16:3e:9f:d1:68", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eff8ff5-0e", "ovs_interfaceid": "6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.931 187223 DEBUG nova.network.os_vif_util [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eff8ff5-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.932 187223 DEBUG os_vif [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eff8ff5-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.934 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.935 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6eff8ff5-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.937 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.938 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.939 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.942 187223 INFO os_vif [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eff8ff5-0e')#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.943 187223 INFO nova.virt.libvirt.driver [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Deleting instance files /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3_del#033[00m
Nov 25 14:20:35 np0005535656 nova_compute[187219]: 2025-11-25 19:20:35.944 187223 INFO nova.virt.libvirt.driver [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Deletion of /var/lib/nova/instances/82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3_del complete#033[00m
Nov 25 14:20:36 np0005535656 nova_compute[187219]: 2025-11-25 19:20:36.017 187223 INFO nova.compute.manager [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 25 14:20:36 np0005535656 nova_compute[187219]: 2025-11-25 19:20:36.018 187223 DEBUG oslo.service.loopingcall [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 25 14:20:36 np0005535656 nova_compute[187219]: 2025-11-25 19:20:36.018 187223 DEBUG nova.compute.manager [-] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 25 14:20:36 np0005535656 nova_compute[187219]: 2025-11-25 19:20:36.018 187223 DEBUG nova.network.neutron [-] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.694 187223 DEBUG nova.compute.manager [req-b4d6afb1-ab10-4c3e-b9f3-9336cd3006cd req-7f7a91f4-1afa-4e4c-87e6-4e8296b9a90c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Received event network-vif-unplugged-6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.696 187223 DEBUG oslo_concurrency.lockutils [req-b4d6afb1-ab10-4c3e-b9f3-9336cd3006cd req-7f7a91f4-1afa-4e4c-87e6-4e8296b9a90c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.696 187223 DEBUG oslo_concurrency.lockutils [req-b4d6afb1-ab10-4c3e-b9f3-9336cd3006cd req-7f7a91f4-1afa-4e4c-87e6-4e8296b9a90c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.697 187223 DEBUG oslo_concurrency.lockutils [req-b4d6afb1-ab10-4c3e-b9f3-9336cd3006cd req-7f7a91f4-1afa-4e4c-87e6-4e8296b9a90c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.697 187223 DEBUG nova.compute.manager [req-b4d6afb1-ab10-4c3e-b9f3-9336cd3006cd req-7f7a91f4-1afa-4e4c-87e6-4e8296b9a90c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] No waiting events found dispatching network-vif-unplugged-6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.698 187223 DEBUG nova.compute.manager [req-b4d6afb1-ab10-4c3e-b9f3-9336cd3006cd req-7f7a91f4-1afa-4e4c-87e6-4e8296b9a90c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Received event network-vif-unplugged-6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.865 187223 DEBUG nova.network.neutron [-] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.890 187223 INFO nova.compute.manager [-] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Took 1.87 seconds to deallocate network for instance.#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.944 187223 DEBUG oslo_concurrency.lockutils [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.945 187223 DEBUG oslo_concurrency.lockutils [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.950 187223 DEBUG oslo_concurrency.lockutils [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:37 np0005535656 nova_compute[187219]: 2025-11-25 19:20:37.999 187223 INFO nova.scheduler.client.report [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Deleted allocations for instance 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3#033[00m
Nov 25 14:20:38 np0005535656 nova_compute[187219]: 2025-11-25 19:20:38.056 187223 DEBUG oslo_concurrency.lockutils [None req-67a730e3-800f-4fe0-b011-04ea1049fad3 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.450s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:38 np0005535656 nova_compute[187219]: 2025-11-25 19:20:38.900 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:38 np0005535656 nova_compute[187219]: 2025-11-25 19:20:38.991 187223 DEBUG oslo_concurrency.lockutils [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "ef106817-b316-4585-877b-b4c688fcc3a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:38 np0005535656 nova_compute[187219]: 2025-11-25 19:20:38.991 187223 DEBUG oslo_concurrency.lockutils [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:38 np0005535656 nova_compute[187219]: 2025-11-25 19:20:38.992 187223 DEBUG oslo_concurrency.lockutils [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:38 np0005535656 nova_compute[187219]: 2025-11-25 19:20:38.992 187223 DEBUG oslo_concurrency.lockutils [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:38 np0005535656 nova_compute[187219]: 2025-11-25 19:20:38.992 187223 DEBUG oslo_concurrency.lockutils [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:38 np0005535656 nova_compute[187219]: 2025-11-25 19:20:38.994 187223 INFO nova.compute.manager [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Terminating instance#033[00m
Nov 25 14:20:38 np0005535656 nova_compute[187219]: 2025-11-25 19:20:38.996 187223 DEBUG nova.compute.manager [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 25 14:20:39 np0005535656 kernel: tap978970e7-02 (unregistering): left promiscuous mode
Nov 25 14:20:39 np0005535656 NetworkManager[55548]: <info>  [1764098439.0258] device (tap978970e7-02): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:20:39 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:39Z|00190|binding|INFO|Releasing lport 978970e7-0207-4f7c-a2e0-78a09ef9c57f from this chassis (sb_readonly=0)
Nov 25 14:20:39 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:39Z|00191|binding|INFO|Setting lport 978970e7-0207-4f7c-a2e0-78a09ef9c57f down in Southbound
Nov 25 14:20:39 np0005535656 ovn_controller[95460]: 2025-11-25T19:20:39Z|00192|binding|INFO|Removing iface tap978970e7-02 ovn-installed in OVS
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.031 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.035 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:d2:0f 10.100.0.7'], port_security=['fa:16:3e:76:d2:0f 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ef106817-b316-4585-877b-b4c688fcc3a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54268334-dbc3-41de-8b55-7e2418c08455', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3851241b16047ed9445aa3074f8dc4c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f677cd87-1f26-4c83-89d6-94af98ee5763', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3375ebcc-f5c9-4616-a48e-8b0a6a2fcf3d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=978970e7-0207-4f7c-a2e0-78a09ef9c57f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.037 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 978970e7-0207-4f7c-a2e0-78a09ef9c57f in datapath 54268334-dbc3-41de-8b55-7e2418c08455 unbound from our chassis#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.040 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 54268334-dbc3-41de-8b55-7e2418c08455, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.041 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[692e1001-1988-4f5c-8327-0363eeb9c194]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.042 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455 namespace which is not needed anymore#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.063 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:39 np0005535656 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000019.scope: Deactivated successfully.
Nov 25 14:20:39 np0005535656 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000019.scope: Consumed 17.202s CPU time.
Nov 25 14:20:39 np0005535656 systemd-machined[153481]: Machine qemu-16-instance-00000019 terminated.
Nov 25 14:20:39 np0005535656 podman[218689]: 2025-11-25 19:20:39.14187461 +0000 UTC m=+0.067948731 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 14:20:39 np0005535656 podman[218686]: 2025-11-25 19:20:39.203710354 +0000 UTC m=+0.130797633 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 25 14:20:39 np0005535656 neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455[218150]: [NOTICE]   (218154) : haproxy version is 2.8.14-c23fe91
Nov 25 14:20:39 np0005535656 neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455[218150]: [NOTICE]   (218154) : path to executable is /usr/sbin/haproxy
Nov 25 14:20:39 np0005535656 neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455[218150]: [WARNING]  (218154) : Exiting Master process...
Nov 25 14:20:39 np0005535656 neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455[218150]: [ALERT]    (218154) : Current worker (218156) exited with code 143 (Terminated)
Nov 25 14:20:39 np0005535656 neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455[218150]: [WARNING]  (218154) : All workers exited. Exiting... (0)
Nov 25 14:20:39 np0005535656 systemd[1]: libpod-b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf.scope: Deactivated successfully.
Nov 25 14:20:39 np0005535656 kernel: tap978970e7-02: entered promiscuous mode
Nov 25 14:20:39 np0005535656 podman[218750]: 2025-11-25 19:20:39.219213202 +0000 UTC m=+0.048699213 container died b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:20:39 np0005535656 kernel: tap978970e7-02 (unregistering): left promiscuous mode
Nov 25 14:20:39 np0005535656 NetworkManager[55548]: <info>  [1764098439.2242] manager: (tap978970e7-02): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.228 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:39 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf-userdata-shm.mount: Deactivated successfully.
Nov 25 14:20:39 np0005535656 systemd[1]: var-lib-containers-storage-overlay-140d571bb66a18462e767c4d8e03b7266d3dea1102cc65d247302ce4d903e3ec-merged.mount: Deactivated successfully.
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.259 187223 INFO nova.virt.libvirt.driver [-] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Instance destroyed successfully.#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.260 187223 DEBUG nova.objects.instance [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lazy-loading 'resources' on Instance uuid ef106817-b316-4585-877b-b4c688fcc3a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:20:39 np0005535656 podman[218750]: 2025-11-25 19:20:39.265472107 +0000 UTC m=+0.094958128 container cleanup b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:20:39 np0005535656 systemd[1]: libpod-conmon-b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf.scope: Deactivated successfully.
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.294 187223 DEBUG nova.virt.libvirt.vif [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:18:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1188360108',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1188360108',id=25,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:18:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b3851241b16047ed9445aa3074f8dc4c',ramdisk_id='',reservation_id='r-ckgeuxby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1931304419-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:18:54Z,user_data=None,user_id='faa0e27f31f840699feb7befa5b86f95',uuid=ef106817-b316-4585-877b-b4c688fcc3a0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.295 187223 DEBUG nova.network.os_vif_util [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Converting VIF {"id": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "address": "fa:16:3e:76:d2:0f", "network": {"id": "54268334-dbc3-41de-8b55-7e2418c08455", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-96101273-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3851241b16047ed9445aa3074f8dc4c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap978970e7-02", "ovs_interfaceid": "978970e7-0207-4f7c-a2e0-78a09ef9c57f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.296 187223 DEBUG nova.network.os_vif_util [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:d2:0f,bridge_name='br-int',has_traffic_filtering=True,id=978970e7-0207-4f7c-a2e0-78a09ef9c57f,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap978970e7-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.297 187223 DEBUG os_vif [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:d2:0f,bridge_name='br-int',has_traffic_filtering=True,id=978970e7-0207-4f7c-a2e0-78a09ef9c57f,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap978970e7-02') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.298 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.298 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap978970e7-02, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.299 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.301 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.304 187223 INFO os_vif [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:d2:0f,bridge_name='br-int',has_traffic_filtering=True,id=978970e7-0207-4f7c-a2e0-78a09ef9c57f,network=Network(54268334-dbc3-41de-8b55-7e2418c08455),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap978970e7-02')#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.305 187223 INFO nova.virt.libvirt.driver [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Deleting instance files /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0_del#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.305 187223 INFO nova.virt.libvirt.driver [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Deletion of /var/lib/nova/instances/ef106817-b316-4585-877b-b4c688fcc3a0_del complete#033[00m
Nov 25 14:20:39 np0005535656 podman[218802]: 2025-11-25 19:20:39.340661812 +0000 UTC m=+0.051782485 container remove b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.347 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6dac297e-aab6-48c7-b1c8-70ef92c83150]: (4, ('Tue Nov 25 07:20:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455 (b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf)\nb087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf\nTue Nov 25 07:20:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455 (b087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf)\nb087af05eabe9132497bf5ce8d1deaff9b166b6e3702047d075d8003d6f3b4bf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.349 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[9c7c6c0a-3da9-49a1-9235-46b4e9a3bb23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.350 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54268334-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.352 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:39 np0005535656 kernel: tap54268334-d0: left promiscuous mode
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.365 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.368 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[a27e8698-0f32-4a90-aaf2-bb17bdbfd02a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.374 187223 INFO nova.compute.manager [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.375 187223 DEBUG oslo.service.loopingcall [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.375 187223 DEBUG nova.compute.manager [-] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.376 187223 DEBUG nova.network.neutron [-] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.384 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5626ebb9-3e2f-4b2e-810d-e4fa3499a389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.386 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4ef1e39c-212d-423f-b94b-51b91d67cb29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.404 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4c462f-c144-4150-8edc-f93f27f06a6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529295, 'reachable_time': 20232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218817, 'error': None, 'target': 'ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:39 np0005535656 systemd[1]: run-netns-ovnmeta\x2d54268334\x2ddbc3\x2d41de\x2d8b55\x2d7e2418c08455.mount: Deactivated successfully.
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.408 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-54268334-dbc3-41de-8b55-7e2418c08455 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:20:39 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:39.409 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[c69744e9-188b-466b-b9f5-024889c406b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.796 187223 DEBUG nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Received event network-vif-deleted-6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.796 187223 DEBUG nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Received event network-vif-plugged-6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.797 187223 DEBUG oslo_concurrency.lockutils [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.797 187223 DEBUG oslo_concurrency.lockutils [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.797 187223 DEBUG oslo_concurrency.lockutils [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.798 187223 DEBUG nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] No waiting events found dispatching network-vif-plugged-6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.798 187223 WARNING nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Received unexpected event network-vif-plugged-6eff8ff5-0e39-45d2-9bdb-afaf49a7dc4c for instance with vm_state deleted and task_state None.#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.799 187223 DEBUG nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Received event network-vif-unplugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.799 187223 DEBUG oslo_concurrency.lockutils [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.799 187223 DEBUG oslo_concurrency.lockutils [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.800 187223 DEBUG oslo_concurrency.lockutils [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.800 187223 DEBUG nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] No waiting events found dispatching network-vif-unplugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.800 187223 DEBUG nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Received event network-vif-unplugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.800 187223 DEBUG nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Received event network-vif-plugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.801 187223 DEBUG oslo_concurrency.lockutils [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.801 187223 DEBUG oslo_concurrency.lockutils [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.801 187223 DEBUG oslo_concurrency.lockutils [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.802 187223 DEBUG nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] No waiting events found dispatching network-vif-plugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.802 187223 WARNING nova.compute.manager [req-e9abd5d3-db6b-46d7-89b2-d374fdc13734 req-b08c6b6d-c334-458d-9726-1122aed16608 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Received unexpected event network-vif-plugged-978970e7-0207-4f7c-a2e0-78a09ef9c57f for instance with vm_state active and task_state deleting.#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.857 187223 DEBUG nova.network.neutron [-] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.871 187223 INFO nova.compute.manager [-] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Took 0.49 seconds to deallocate network for instance.#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.930 187223 DEBUG oslo_concurrency.lockutils [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.930 187223 DEBUG oslo_concurrency.lockutils [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:39 np0005535656 nova_compute[187219]: 2025-11-25 19:20:39.985 187223 DEBUG nova.compute.provider_tree [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:20:40 np0005535656 nova_compute[187219]: 2025-11-25 19:20:40.002 187223 DEBUG nova.scheduler.client.report [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:20:40 np0005535656 nova_compute[187219]: 2025-11-25 19:20:40.025 187223 DEBUG oslo_concurrency.lockutils [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:40 np0005535656 nova_compute[187219]: 2025-11-25 19:20:40.058 187223 INFO nova.scheduler.client.report [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Deleted allocations for instance ef106817-b316-4585-877b-b4c688fcc3a0#033[00m
Nov 25 14:20:40 np0005535656 nova_compute[187219]: 2025-11-25 19:20:40.149 187223 DEBUG oslo_concurrency.lockutils [None req-a2525558-da69-4acb-87df-72f799181968 faa0e27f31f840699feb7befa5b86f95 b3851241b16047ed9445aa3074f8dc4c - - default default] Lock "ef106817-b316-4585-877b-b4c688fcc3a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:41 np0005535656 nova_compute[187219]: 2025-11-25 19:20:41.893 187223 DEBUG nova.compute.manager [req-2d326799-e4e9-4968-8144-3c5a37ad08a0 req-56a93265-5f44-4a7b-9889-dc2b83173cd5 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Received event network-vif-deleted-978970e7-0207-4f7c-a2e0-78a09ef9c57f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:20:43 np0005535656 nova_compute[187219]: 2025-11-25 19:20:43.937 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:44 np0005535656 nova_compute[187219]: 2025-11-25 19:20:44.301 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:45 np0005535656 podman[218819]: 2025-11-25 19:20:45.985893263 +0000 UTC m=+0.095396739 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 14:20:48 np0005535656 nova_compute[187219]: 2025-11-25 19:20:48.942 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:49 np0005535656 nova_compute[187219]: 2025-11-25 19:20:49.303 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:20:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:20:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:20:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:20:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:20:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:20:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:20:49 np0005535656 podman[218839]: 2025-11-25 19:20:49.983186519 +0000 UTC m=+0.098997147 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 14:20:50 np0005535656 nova_compute[187219]: 2025-11-25 19:20:50.905 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764098435.9032397, 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:20:50 np0005535656 nova_compute[187219]: 2025-11-25 19:20:50.907 187223 INFO nova.compute.manager [-] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:20:50 np0005535656 nova_compute[187219]: 2025-11-25 19:20:50.936 187223 DEBUG nova.compute.manager [None req-d29510b9-5262-4a93-aac7-5459e53fdc5e - - - - - -] [instance: 82d4a3e7-9c1c-4c0c-a8fa-a593425c1af3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:20:53 np0005535656 nova_compute[187219]: 2025-11-25 19:20:53.994 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:54 np0005535656 nova_compute[187219]: 2025-11-25 19:20:54.257 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764098439.255698, ef106817-b316-4585-877b-b4c688fcc3a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:20:54 np0005535656 nova_compute[187219]: 2025-11-25 19:20:54.258 187223 INFO nova.compute.manager [-] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:20:54 np0005535656 nova_compute[187219]: 2025-11-25 19:20:54.280 187223 DEBUG nova.compute.manager [None req-81f23637-ce7a-462a-a83c-b1586d60439f - - - - - -] [instance: ef106817-b316-4585-877b-b4c688fcc3a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:20:54 np0005535656 nova_compute[187219]: 2025-11-25 19:20:54.305 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:59 np0005535656 nova_compute[187219]: 2025-11-25 19:20:59.027 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:59.098 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:20:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:59.099 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:20:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:20:59.099 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:20:59 np0005535656 nova_compute[187219]: 2025-11-25 19:20:59.308 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:20:59 np0005535656 podman[218859]: 2025-11-25 19:20:59.982573809 +0000 UTC m=+0.090485857 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:21:04 np0005535656 nova_compute[187219]: 2025-11-25 19:21:04.070 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:04 np0005535656 nova_compute[187219]: 2025-11-25 19:21:04.311 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:05 np0005535656 podman[197580]: time="2025-11-25T19:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:21:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:21:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Nov 25 14:21:09 np0005535656 nova_compute[187219]: 2025-11-25 19:21:09.127 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:09 np0005535656 nova_compute[187219]: 2025-11-25 19:21:09.312 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:09 np0005535656 nova_compute[187219]: 2025-11-25 19:21:09.684 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:09 np0005535656 nova_compute[187219]: 2025-11-25 19:21:09.685 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:21:09 np0005535656 nova_compute[187219]: 2025-11-25 19:21:09.685 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:21:09 np0005535656 nova_compute[187219]: 2025-11-25 19:21:09.699 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:21:09 np0005535656 ovn_controller[95460]: 2025-11-25T19:21:09Z|00193|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Nov 25 14:21:09 np0005535656 podman[218886]: 2025-11-25 19:21:09.986056461 +0000 UTC m=+0.100673362 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 14:21:10 np0005535656 podman[218885]: 2025-11-25 19:21:10.043281302 +0000 UTC m=+0.159904627 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 14:21:11 np0005535656 nova_compute[187219]: 2025-11-25 19:21:11.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:12 np0005535656 nova_compute[187219]: 2025-11-25 19:21:12.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:14 np0005535656 nova_compute[187219]: 2025-11-25 19:21:14.130 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:14 np0005535656 nova_compute[187219]: 2025-11-25 19:21:14.318 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:14 np0005535656 nova_compute[187219]: 2025-11-25 19:21:14.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:15 np0005535656 nova_compute[187219]: 2025-11-25 19:21:15.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:16 np0005535656 podman[218929]: 2025-11-25 19:21:16.944440385 +0000 UTC m=+0.066571865 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:21:17 np0005535656 nova_compute[187219]: 2025-11-25 19:21:17.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:18 np0005535656 nova_compute[187219]: 2025-11-25 19:21:18.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:18 np0005535656 nova_compute[187219]: 2025-11-25 19:21:18.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:21:19 np0005535656 nova_compute[187219]: 2025-11-25 19:21:19.132 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:19 np0005535656 nova_compute[187219]: 2025-11-25 19:21:19.321 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:21:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:21:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:21:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:21:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:21:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:21:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:21:19 np0005535656 nova_compute[187219]: 2025-11-25 19:21:19.859 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.693 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.693 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.694 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.694 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:21:20 np0005535656 podman[218952]: 2025-11-25 19:21:20.858840049 +0000 UTC m=+0.104623208 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.962 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.963 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5874MB free_disk=73.16271209716797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.964 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:21:20 np0005535656 nova_compute[187219]: 2025-11-25 19:21:20.964 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:21:21 np0005535656 nova_compute[187219]: 2025-11-25 19:21:21.111 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:21:21 np0005535656 nova_compute[187219]: 2025-11-25 19:21:21.111 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:21:21 np0005535656 nova_compute[187219]: 2025-11-25 19:21:21.245 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:21:21 np0005535656 nova_compute[187219]: 2025-11-25 19:21:21.268 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:21:21 np0005535656 nova_compute[187219]: 2025-11-25 19:21:21.301 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:21:21 np0005535656 nova_compute[187219]: 2025-11-25 19:21:21.301 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:21:22 np0005535656 nova_compute[187219]: 2025-11-25 19:21:22.298 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:21:24 np0005535656 nova_compute[187219]: 2025-11-25 19:21:24.179 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:24 np0005535656 nova_compute[187219]: 2025-11-25 19:21:24.323 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:24 np0005535656 nova_compute[187219]: 2025-11-25 19:21:24.900 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:21:24.901 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:21:24 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:21:24.903 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:21:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:21:28.906 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:21:29 np0005535656 nova_compute[187219]: 2025-11-25 19:21:29.219 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:29 np0005535656 nova_compute[187219]: 2025-11-25 19:21:29.324 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:30 np0005535656 podman[218973]: 2025-11-25 19:21:30.967764509 +0000 UTC m=+0.092673256 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:21:34 np0005535656 nova_compute[187219]: 2025-11-25 19:21:34.222 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:34 np0005535656 nova_compute[187219]: 2025-11-25 19:21:34.325 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:35 np0005535656 podman[197580]: time="2025-11-25T19:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:21:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:21:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 25 14:21:39 np0005535656 nova_compute[187219]: 2025-11-25 19:21:39.247 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:39 np0005535656 nova_compute[187219]: 2025-11-25 19:21:39.327 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:41 np0005535656 podman[218998]: 2025-11-25 19:21:41.002374828 +0000 UTC m=+0.103249110 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 25 14:21:41 np0005535656 podman[218997]: 2025-11-25 19:21:41.020015443 +0000 UTC m=+0.136092885 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:21:44 np0005535656 nova_compute[187219]: 2025-11-25 19:21:44.280 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:44 np0005535656 nova_compute[187219]: 2025-11-25 19:21:44.330 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:47 np0005535656 podman[219041]: 2025-11-25 19:21:47.96889299 +0000 UTC m=+0.090857577 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm)
Nov 25 14:21:49 np0005535656 nova_compute[187219]: 2025-11-25 19:21:49.281 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:49 np0005535656 nova_compute[187219]: 2025-11-25 19:21:49.331 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:21:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:21:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:21:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:21:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:21:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:21:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:21:51 np0005535656 podman[219063]: 2025-11-25 19:21:51.954335908 +0000 UTC m=+0.078038562 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 25 14:21:54 np0005535656 nova_compute[187219]: 2025-11-25 19:21:54.287 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:54 np0005535656 nova_compute[187219]: 2025-11-25 19:21:54.332 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:55 np0005535656 ovn_controller[95460]: 2025-11-25T19:21:55Z|00194|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 25 14:21:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:21:59.099 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:21:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:21:59.100 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:21:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:21:59.100 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:21:59 np0005535656 nova_compute[187219]: 2025-11-25 19:21:59.311 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:21:59 np0005535656 nova_compute[187219]: 2025-11-25 19:21:59.333 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:01 np0005535656 podman[219084]: 2025-11-25 19:22:01.969142586 +0000 UTC m=+0.075803312 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:22:04 np0005535656 nova_compute[187219]: 2025-11-25 19:22:04.318 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:04 np0005535656 nova_compute[187219]: 2025-11-25 19:22:04.333 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:05 np0005535656 podman[197580]: time="2025-11-25T19:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:22:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:22:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Nov 25 14:22:09 np0005535656 nova_compute[187219]: 2025-11-25 19:22:09.334 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:22:09 np0005535656 nova_compute[187219]: 2025-11-25 19:22:09.336 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:22:09 np0005535656 nova_compute[187219]: 2025-11-25 19:22:09.337 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 25 14:22:09 np0005535656 nova_compute[187219]: 2025-11-25 19:22:09.337 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 25 14:22:09 np0005535656 nova_compute[187219]: 2025-11-25 19:22:09.363 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:09 np0005535656 nova_compute[187219]: 2025-11-25 19:22:09.364 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 25 14:22:10 np0005535656 nova_compute[187219]: 2025-11-25 19:22:10.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:22:10 np0005535656 nova_compute[187219]: 2025-11-25 19:22:10.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:22:10 np0005535656 nova_compute[187219]: 2025-11-25 19:22:10.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:22:10 np0005535656 nova_compute[187219]: 2025-11-25 19:22:10.687 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:22:11 np0005535656 nova_compute[187219]: 2025-11-25 19:22:11.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:22:11 np0005535656 podman[219110]: 2025-11-25 19:22:11.99055273 +0000 UTC m=+0.091470724 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 25 14:22:12 np0005535656 podman[219109]: 2025-11-25 19:22:12.029879089 +0000 UTC m=+0.137183585 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 25 14:22:13 np0005535656 nova_compute[187219]: 2025-11-25 19:22:13.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:22:14 np0005535656 nova_compute[187219]: 2025-11-25 19:22:14.365 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:22:15 np0005535656 nova_compute[187219]: 2025-11-25 19:22:15.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:22:16 np0005535656 nova_compute[187219]: 2025-11-25 19:22:16.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:22:17 np0005535656 nova_compute[187219]: 2025-11-25 19:22:17.669 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:22:18 np0005535656 nova_compute[187219]: 2025-11-25 19:22:18.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:22:18 np0005535656 nova_compute[187219]: 2025-11-25 19:22:18.671 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:22:18 np0005535656 podman[219154]: 2025-11-25 19:22:18.958805129 +0000 UTC m=+0.078202687 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.367 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.369 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.370 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.370 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.393 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.394 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.411 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.411 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 25 14:22:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:22:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:22:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:22:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:22:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:22:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
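The exporter's "no control socket files found" errors come from probing the daemons' run directories for `<daemon>.<pid>.ctl` control sockets before issuing appctl calls; on a compute node that runs ovn-controller but not ovn-northd or a local ovsdb-server, those sockets legitimately do not exist. A sketch of that kind of probe (the directory and helper name are illustrative, not the exporter's Go code):

```python
import glob
import os

def find_control_socket(rundir, daemon):
    """Return the first <daemon>.<pid>.ctl socket path, or None if absent."""
    matches = glob.glob(os.path.join(rundir, f"{daemon}.*.ctl"))
    return matches[0] if matches else None

# ovn-northd runs on controller nodes, so a compute host finds nothing
# and the exporter logs "no control socket files found".
sock = find_control_socket("/tmp/definitely-missing-rundir", "ovn-northd")
```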
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.424 187223 DEBUG nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:22:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.521 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.522 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.533 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.533 187223 INFO nova.compute.claims [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.671 187223 DEBUG nova.compute.provider_tree [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.684 187223 DEBUG nova.scheduler.client.report [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
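The inventory record logged above is what placement uses to compute schedulable capacity: per resource class, (total - reserved) * allocation_ratio. A sketch of that arithmetic over the logged values (the helper name is ours, not placement's API):

```python
# Inventory as reported in the log for provider 752b63a7-...
inventory = {
    'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 7680, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 79,   'reserved': 1,   'allocation_ratio': 0.9},
}

def schedulable(inv):
    # Placement-style capacity: (total - reserved) * allocation_ratio.
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

caps = schedulable(inventory)
```

So this host can overcommit CPU 4x (32 schedulable VCPUs on 8 physical), runs memory at 1:1 after the 512 MB host reservation, and undercommits disk slightly (ratio 0.9) to leave headroom for qcow2 growth.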
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.700 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.700 187223 DEBUG nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.762 187223 DEBUG nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.762 187223 DEBUG nova.network.neutron [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.787 187223 INFO nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.815 187223 DEBUG nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.945 187223 DEBUG nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.947 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.947 187223 INFO nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Creating image(s)#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.948 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Acquiring lock "/var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.949 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "/var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.950 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "/var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:19 np0005535656 nova_compute[187219]: 2025-11-25 19:22:19.975 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.024 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.025 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.026 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.038 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.093 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.094 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.140 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
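The qemu-img sequence above first inspects the cached base image under `_base/`, then creates a qcow2 overlay on top of it for the instance: writes land in the per-instance `disk` file while unmodified reads fall through to the shared raw base. A sketch that assembles the same overlay-creation argv the log shows (paths taken from the log; this builds the command list only, without executing it):

```python
base = "/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473"
disk = "/var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk"
size_bytes = 1073741824  # 1 GiB virtual size, as requested in the log

# qcow2 overlay backed by the raw base image; backing_fmt is pinned
# explicitly so qemu never has to probe the backing file's format.
cmd = ["env", "LC_ALL=C", "LANG=C", "qemu-img", "create", "-f", "qcow2",
       "-o", f"backing_file={base},backing_fmt=raw", disk, str(size_bytes)]
```

This copy-on-write layout is why the create returns in 0.045s: no data is copied, only overlay metadata is written.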
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.141 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.141 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.213 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.214 187223 DEBUG nova.virt.disk.api [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Checking if we can resize image /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.214 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.262 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.263 187223 DEBUG nova.virt.disk.api [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Cannot resize image /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
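The "Cannot resize image ... to a smaller size" debug line is nova's resize guard in `nova.virt.disk.api.can_resize_image`: an image is only resized when the requested size is strictly larger than its current virtual size, since shrinking a file-backed image would truncate guest data. Here the overlay was just created at the requested 1 GiB, so the check declines and the disk is left as-is. A reduced sketch of that behavior (simplified from what the log shows, not copied from nova's source):

```python
def can_resize_image(virtual_size, requested_size):
    """Only grow: shrinking a file-backed image would truncate guest data."""
    return requested_size > virtual_size

GIB = 1073741824
# Overlay already 1 GiB virtual, requested 1 GiB -> no resize (the log's case).
same = can_resize_image(GIB, GIB)
# A flavor with a larger root disk would pass the check.
grow = can_resize_image(GIB, 2 * GIB)
```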
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.263 187223 DEBUG nova.objects.instance [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lazy-loading 'migration_context' on Instance uuid 619a8427-fc9c-471f-9812-a27c213f4523 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.284 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.285 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Ensure instance console log exists: /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.285 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.285 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.286 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:20 np0005535656 nova_compute[187219]: 2025-11-25 19:22:20.664 187223 DEBUG nova.policy [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8823b60a0bdf456498433218e470fb7f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c1fe2249527b46ceb38fa77fbc5aff54', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:22:21 np0005535656 nova_compute[187219]: 2025-11-25 19:22:21.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:22:21 np0005535656 nova_compute[187219]: 2025-11-25 19:22:21.693 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:21 np0005535656 nova_compute[187219]: 2025-11-25 19:22:21.694 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:21 np0005535656 nova_compute[187219]: 2025-11-25 19:22:21.694 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:21 np0005535656 nova_compute[187219]: 2025-11-25 19:22:21.695 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:22:21 np0005535656 nova_compute[187219]: 2025-11-25 19:22:21.935 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:22:21 np0005535656 nova_compute[187219]: 2025-11-25 19:22:21.937 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5879MB free_disk=73.16247177124023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:22:21 np0005535656 nova_compute[187219]: 2025-11-25 19:22:21.938 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:21 np0005535656 nova_compute[187219]: 2025-11-25 19:22:21.938 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:22 np0005535656 nova_compute[187219]: 2025-11-25 19:22:22.012 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Instance 619a8427-fc9c-471f-9812-a27c213f4523 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 25 14:22:22 np0005535656 nova_compute[187219]: 2025-11-25 19:22:22.012 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:22:22 np0005535656 nova_compute[187219]: 2025-11-25 19:22:22.012 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:22:22 np0005535656 nova_compute[187219]: 2025-11-25 19:22:22.053 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:22:22 np0005535656 nova_compute[187219]: 2025-11-25 19:22:22.067 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:22:22 np0005535656 nova_compute[187219]: 2025-11-25 19:22:22.091 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:22:22 np0005535656 nova_compute[187219]: 2025-11-25 19:22:22.091 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:22 np0005535656 nova_compute[187219]: 2025-11-25 19:22:22.685 187223 DEBUG nova.network.neutron [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Successfully created port: 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:22:22 np0005535656 podman[219192]: 2025-11-25 19:22:22.974330446 +0000 UTC m=+0.083335404 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 25 14:22:23 np0005535656 nova_compute[187219]: 2025-11-25 19:22:23.092 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:22:23 np0005535656 nova_compute[187219]: 2025-11-25 19:22:23.925 187223 DEBUG nova.network.neutron [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Successfully updated port: 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:22:23 np0005535656 nova_compute[187219]: 2025-11-25 19:22:23.946 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Acquiring lock "refresh_cache-619a8427-fc9c-471f-9812-a27c213f4523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:22:23 np0005535656 nova_compute[187219]: 2025-11-25 19:22:23.947 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Acquired lock "refresh_cache-619a8427-fc9c-471f-9812-a27c213f4523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:22:23 np0005535656 nova_compute[187219]: 2025-11-25 19:22:23.947 187223 DEBUG nova.network.neutron [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:22:24 np0005535656 nova_compute[187219]: 2025-11-25 19:22:24.096 187223 DEBUG nova.compute.manager [req-7c927d46-269f-453e-94f1-1c1f88a8f1b7 req-8e0a986e-6694-420e-92f4-eeaca4d92010 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-changed-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:24 np0005535656 nova_compute[187219]: 2025-11-25 19:22:24.096 187223 DEBUG nova.compute.manager [req-7c927d46-269f-453e-94f1-1c1f88a8f1b7 req-8e0a986e-6694-420e-92f4-eeaca4d92010 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Refreshing instance network info cache due to event network-changed-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:22:24 np0005535656 nova_compute[187219]: 2025-11-25 19:22:24.097 187223 DEBUG oslo_concurrency.lockutils [req-7c927d46-269f-453e-94f1-1c1f88a8f1b7 req-8e0a986e-6694-420e-92f4-eeaca4d92010 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-619a8427-fc9c-471f-9812-a27c213f4523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:22:24 np0005535656 nova_compute[187219]: 2025-11-25 19:22:24.413 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:22:24 np0005535656 nova_compute[187219]: 2025-11-25 19:22:24.613 187223 DEBUG nova.network.neutron [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:22:24 np0005535656 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.893 187223 DEBUG nova.network.neutron [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Updating instance_info_cache with network_info: [{"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.939 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Releasing lock "refresh_cache-619a8427-fc9c-471f-9812-a27c213f4523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.940 187223 DEBUG nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Instance network_info: |[{"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.941 187223 DEBUG oslo_concurrency.lockutils [req-7c927d46-269f-453e-94f1-1c1f88a8f1b7 req-8e0a986e-6694-420e-92f4-eeaca4d92010 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-619a8427-fc9c-471f-9812-a27c213f4523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.941 187223 DEBUG nova.network.neutron [req-7c927d46-269f-453e-94f1-1c1f88a8f1b7 req-8e0a986e-6694-420e-92f4-eeaca4d92010 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Refreshing network info cache for port 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.944 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Start _get_guest_xml network_info=[{"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.949 187223 WARNING nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.955 187223 DEBUG nova.virt.libvirt.host [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.956 187223 DEBUG nova.virt.libvirt.host [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.960 187223 DEBUG nova.virt.libvirt.host [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.961 187223 DEBUG nova.virt.libvirt.host [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.962 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.962 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.962 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.963 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.963 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.963 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.964 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.964 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.964 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.965 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.965 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.965 187223 DEBUG nova.virt.hardware [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.970 187223 DEBUG nova.virt.libvirt.vif [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:22:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-646499507',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-646499507',id=28,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c1fe2249527b46ceb38fa77fbc5aff54',ramdisk_id='',reservation_id='r-mo3m0lqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1652032292',owner_user_name='tempest-TestExecuteWorkloadBa
lancingStrategy-1652032292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:22:19Z,user_data=None,user_id='8823b60a0bdf456498433218e470fb7f',uuid=619a8427-fc9c-471f-9812-a27c213f4523,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.970 187223 DEBUG nova.network.os_vif_util [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Converting VIF {"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.971 187223 DEBUG nova.network.os_vif_util [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:d9:c0,bridge_name='br-int',has_traffic_filtering=True,id=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c,network=Network(2ec4a058-7af2-425a-bcc3-2491cdb7cf97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30daa0b3-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.972 187223 DEBUG nova.objects.instance [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lazy-loading 'pci_devices' on Instance uuid 619a8427-fc9c-471f-9812-a27c213f4523 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.991 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <uuid>619a8427-fc9c-471f-9812-a27c213f4523</uuid>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <name>instance-0000001c</name>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-646499507</nova:name>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:22:25</nova:creationTime>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:        <nova:user uuid="8823b60a0bdf456498433218e470fb7f">tempest-TestExecuteWorkloadBalancingStrategy-1652032292-project-member</nova:user>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:        <nova:project uuid="c1fe2249527b46ceb38fa77fbc5aff54">tempest-TestExecuteWorkloadBalancingStrategy-1652032292</nova:project>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:        <nova:port uuid="30daa0b3-22f3-4c1a-9e53-90d60a7ac48c">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <entry name="serial">619a8427-fc9c-471f-9812-a27c213f4523</entry>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <entry name="uuid">619a8427-fc9c-471f-9812-a27c213f4523</entry>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk.config"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:7c:d9:c0"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <target dev="tap30daa0b3-22"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/console.log" append="off"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:22:25 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:22:25 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:22:25 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:22:25 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.992 187223 DEBUG nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Preparing to wait for external event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.992 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.993 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.993 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.994 187223 DEBUG nova.virt.libvirt.vif [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:22:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-646499507',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-646499507',id=28,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c1fe2249527b46ceb38fa77fbc5aff54',ramdisk_id='',reservation_id='r-mo3m0lqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1652032292',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1652032292-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:22:19Z,user_data=None,user_id='8823b60a0bdf456498433218e470fb7f',uuid=619a8427-fc9c-471f-9812-a27c213f4523,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.994 187223 DEBUG nova.network.os_vif_util [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Converting VIF {"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.995 187223 DEBUG nova.network.os_vif_util [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:d9:c0,bridge_name='br-int',has_traffic_filtering=True,id=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c,network=Network(2ec4a058-7af2-425a-bcc3-2491cdb7cf97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30daa0b3-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.995 187223 DEBUG os_vif [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:d9:c0,bridge_name='br-int',has_traffic_filtering=True,id=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c,network=Network(2ec4a058-7af2-425a-bcc3-2491cdb7cf97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30daa0b3-22') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.996 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.996 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:22:25 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.997 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:25.999 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.000 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30daa0b3-22, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.000 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap30daa0b3-22, col_values=(('external_ids', {'iface-id': '30daa0b3-22f3-4c1a-9e53-90d60a7ac48c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:d9:c0', 'vm-uuid': '619a8427-fc9c-471f-9812-a27c213f4523'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.002 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:26 np0005535656 NetworkManager[55548]: <info>  [1764098546.0042] manager: (tap30daa0b3-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.006 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.012 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.013 187223 INFO os_vif [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:d9:c0,bridge_name='br-int',has_traffic_filtering=True,id=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c,network=Network(2ec4a058-7af2-425a-bcc3-2491cdb7cf97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30daa0b3-22')#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.079 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.079 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.079 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] No VIF found with MAC fa:16:3e:7c:d9:c0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.080 187223 INFO nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Using config drive#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.849 187223 INFO nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Creating config drive at /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk.config#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.858 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7pyj0uma execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:22:26 np0005535656 nova_compute[187219]: 2025-11-25 19:22:26.997 187223 DEBUG oslo_concurrency.processutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7pyj0uma" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:22:27 np0005535656 kernel: tap30daa0b3-22: entered promiscuous mode
Nov 25 14:22:27 np0005535656 NetworkManager[55548]: <info>  [1764098547.0853] manager: (tap30daa0b3-22): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Nov 25 14:22:27 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:27Z|00195|binding|INFO|Claiming lport 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for this chassis.
Nov 25 14:22:27 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:27Z|00196|binding|INFO|30daa0b3-22f3-4c1a-9e53-90d60a7ac48c: Claiming fa:16:3e:7c:d9:c0 10.100.0.4
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.086 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.092 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.099 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.113 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:d9:c0 10.100.0.4'], port_security=['fa:16:3e:7c:d9:c0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '619a8427-fc9c-471f-9812-a27c213f4523', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ec4a058-7af2-425a-bcc3-2491cdb7cf97', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1fe2249527b46ceb38fa77fbc5aff54', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5905b5c6-8ab2-402f-a4ea-138384a0bcbf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af72c9a8-4619-4c6b-abfa-bcfd9a88042b, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.115 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c in datapath 2ec4a058-7af2-425a-bcc3-2491cdb7cf97 bound to our chassis#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.117 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2ec4a058-7af2-425a-bcc3-2491cdb7cf97#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.134 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff08416-ec03-40be-bdd6-599179284821]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.135 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2ec4a058-71 in ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:22:27 np0005535656 systemd-udevd[219234]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.138 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2ec4a058-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.138 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[3f762d0d-44f9-47d9-bbd9-f42cbb91dfe3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.139 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5af31737-b3e4-4d33-9fa9-fb0a21df6959]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 systemd-machined[153481]: New machine qemu-18-instance-0000001c.
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.157 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0aa8de-1687-4941-b645-c111aa6be522]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 NetworkManager[55548]: <info>  [1764098547.1586] device (tap30daa0b3-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:22:27 np0005535656 NetworkManager[55548]: <info>  [1764098547.1595] device (tap30daa0b3-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:22:27 np0005535656 systemd[1]: Started Virtual Machine qemu-18-instance-0000001c.
Nov 25 14:22:27 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:27Z|00197|binding|INFO|Setting lport 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c ovn-installed in OVS
Nov 25 14:22:27 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:27Z|00198|binding|INFO|Setting lport 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c up in Southbound
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.188 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.192 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5411b66e-3344-4b22-88ba-a0fac9b3f361]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.229 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[99abcf98-5c62-4f23-8be4-2539ae4c399b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.236 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[868946dd-3a4f-425b-81ee-72ac28a94f2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 NetworkManager[55548]: <info>  [1764098547.2378] manager: (tap2ec4a058-70): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.275 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[21298279-fb4a-47c5-8d34-4c61c9f0989b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.278 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef96323-3e7c-41da-8718-19f4f4cfc5ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 NetworkManager[55548]: <info>  [1764098547.3055] device (tap2ec4a058-70): carrier: link connected
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.315 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[6a1c461f-3545-4c99-b439-9b957d1d469a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.339 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f61c14e9-6ed5-473a-8698-b4f172edc231]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ec4a058-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:8b:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550883, 'reachable_time': 31151, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219266, 'error': None, 'target': 'ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.355 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[bb752452-8b0c-429f-b597-efb4fc299955]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec1:8be8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550883, 'tstamp': 550883}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219267, 'error': None, 'target': 'ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.372 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6c2f4cb8-2d5b-4ac7-92f0-e6296b4b604e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ec4a058-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:8b:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550883, 'reachable_time': 31151, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219268, 'error': None, 'target': 'ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.405 187223 DEBUG nova.compute.manager [req-35135b71-6367-475b-a9d4-4f9da76dae24 req-796530c2-5427-431c-8bf7-84cff2e343dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.406 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf6a0ff-161d-4b30-a87a-059c22713c83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.407 187223 DEBUG oslo_concurrency.lockutils [req-35135b71-6367-475b-a9d4-4f9da76dae24 req-796530c2-5427-431c-8bf7-84cff2e343dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.408 187223 DEBUG oslo_concurrency.lockutils [req-35135b71-6367-475b-a9d4-4f9da76dae24 req-796530c2-5427-431c-8bf7-84cff2e343dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.409 187223 DEBUG oslo_concurrency.lockutils [req-35135b71-6367-475b-a9d4-4f9da76dae24 req-796530c2-5427-431c-8bf7-84cff2e343dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.409 187223 DEBUG nova.compute.manager [req-35135b71-6367-475b-a9d4-4f9da76dae24 req-796530c2-5427-431c-8bf7-84cff2e343dc 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Processing event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.476 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4a74e8-2a83-4aad-818b-c2258d4f840a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.478 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ec4a058-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.478 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.479 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ec4a058-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.481 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:27 np0005535656 NetworkManager[55548]: <info>  [1764098547.4820] manager: (tap2ec4a058-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Nov 25 14:22:27 np0005535656 kernel: tap2ec4a058-70: entered promiscuous mode
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.484 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2ec4a058-70, col_values=(('external_ids', {'iface-id': '1ca82109-1588-49e1-8b56-67aaa650cde8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.485 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:27 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:27Z|00199|binding|INFO|Releasing lport 1ca82109-1588-49e1-8b56-67aaa650cde8 from this chassis (sb_readonly=0)
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.487 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2ec4a058-7af2-425a-bcc3-2491cdb7cf97.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2ec4a058-7af2-425a-bcc3-2491cdb7cf97.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.487 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb389cd-e85f-4f73-840e-466f41832a9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.488 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-2ec4a058-7af2-425a-bcc3-2491cdb7cf97
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/2ec4a058-7af2-425a-bcc3-2491cdb7cf97.pid.haproxy
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 2ec4a058-7af2-425a-bcc3-2491cdb7cf97
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.489 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97', 'env', 'PROCESS_TAG=haproxy-2ec4a058-7af2-425a-bcc3-2491cdb7cf97', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2ec4a058-7af2-425a-bcc3-2491cdb7cf97.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.497 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:27 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:27.848 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.849 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.903 187223 DEBUG nova.network.neutron [req-7c927d46-269f-453e-94f1-1c1f88a8f1b7 req-8e0a986e-6694-420e-92f4-eeaca4d92010 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Updated VIF entry in instance network info cache for port 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.903 187223 DEBUG nova.network.neutron [req-7c927d46-269f-453e-94f1-1c1f88a8f1b7 req-8e0a986e-6694-420e-92f4-eeaca4d92010 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Updating instance_info_cache with network_info: [{"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:22:27 np0005535656 nova_compute[187219]: 2025-11-25 19:22:27.924 187223 DEBUG oslo_concurrency.lockutils [req-7c927d46-269f-453e-94f1-1c1f88a8f1b7 req-8e0a986e-6694-420e-92f4-eeaca4d92010 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-619a8427-fc9c-471f-9812-a27c213f4523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:22:27 np0005535656 podman[219299]: 2025-11-25 19:22:27.923130042 +0000 UTC m=+0.063882491 container create 3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 25 14:22:27 np0005535656 systemd[1]: Started libpod-conmon-3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b.scope.
Nov 25 14:22:27 np0005535656 podman[219299]: 2025-11-25 19:22:27.882599601 +0000 UTC m=+0.023352060 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:22:27 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:22:27 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/783896b8f6cfb98978d336a28ea0dc37877858603e2a3e2ccf3d07e85813c9da/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.000 187223 DEBUG nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.001 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098547.9994686, 619a8427-fc9c-471f-9812-a27c213f4523 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.001 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] VM Started (Lifecycle Event)#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.006 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.010 187223 INFO nova.virt.libvirt.driver [-] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Instance spawned successfully.#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.010 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 14:22:28 np0005535656 podman[219299]: 2025-11-25 19:22:28.0177792 +0000 UTC m=+0.158531679 container init 3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 25 14:22:28 np0005535656 podman[219299]: 2025-11-25 19:22:28.033459872 +0000 UTC m=+0.174212321 container start 3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.037 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.044 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.049 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.050 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.051 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.051 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.052 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.052 187223 DEBUG nova.virt.libvirt.driver [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:22:28 np0005535656 neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97[219321]: [NOTICE]   (219326) : New worker (219328) forked
Nov 25 14:22:28 np0005535656 neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97[219321]: [NOTICE]   (219326) : Loading success.
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.078 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.078 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098547.9996412, 619a8427-fc9c-471f-9812-a27c213f4523 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.079 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:22:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:28.106 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.121 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.125 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098548.0036812, 619a8427-fc9c-471f-9812-a27c213f4523 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.125 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.131 187223 INFO nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Took 8.19 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.132 187223 DEBUG nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.138 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.140 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.160 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.216 187223 INFO nova.compute.manager [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Took 8.73 seconds to build instance.#033[00m
Nov 25 14:22:28 np0005535656 nova_compute[187219]: 2025-11-25 19:22:28.235 187223 DEBUG oslo_concurrency.lockutils [None req-47a48835-40f1-453f-a0bf-29c48c81c533 8823b60a0bdf456498433218e470fb7f c1fe2249527b46ceb38fa77fbc5aff54 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.841s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:29 np0005535656 nova_compute[187219]: 2025-11-25 19:22:29.463 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:29 np0005535656 nova_compute[187219]: 2025-11-25 19:22:29.505 187223 DEBUG nova.compute.manager [req-873ddcd8-44c4-4f7b-a99e-559a5b21fdef req-b702a1d4-c95b-4c58-9887-263d874bbaaf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:29 np0005535656 nova_compute[187219]: 2025-11-25 19:22:29.506 187223 DEBUG oslo_concurrency.lockutils [req-873ddcd8-44c4-4f7b-a99e-559a5b21fdef req-b702a1d4-c95b-4c58-9887-263d874bbaaf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:29 np0005535656 nova_compute[187219]: 2025-11-25 19:22:29.506 187223 DEBUG oslo_concurrency.lockutils [req-873ddcd8-44c4-4f7b-a99e-559a5b21fdef req-b702a1d4-c95b-4c58-9887-263d874bbaaf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:29 np0005535656 nova_compute[187219]: 2025-11-25 19:22:29.507 187223 DEBUG oslo_concurrency.lockutils [req-873ddcd8-44c4-4f7b-a99e-559a5b21fdef req-b702a1d4-c95b-4c58-9887-263d874bbaaf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:29 np0005535656 nova_compute[187219]: 2025-11-25 19:22:29.508 187223 DEBUG nova.compute.manager [req-873ddcd8-44c4-4f7b-a99e-559a5b21fdef req-b702a1d4-c95b-4c58-9887-263d874bbaaf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] No waiting events found dispatching network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:22:29 np0005535656 nova_compute[187219]: 2025-11-25 19:22:29.508 187223 WARNING nova.compute.manager [req-873ddcd8-44c4-4f7b-a99e-559a5b21fdef req-b702a1d4-c95b-4c58-9887-263d874bbaaf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received unexpected event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for instance with vm_state active and task_state None.#033[00m
Nov 25 14:22:31 np0005535656 nova_compute[187219]: 2025-11-25 19:22:31.004 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:32 np0005535656 podman[219337]: 2025-11-25 19:22:32.975455854 +0000 UTC m=+0.087322082 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:22:34 np0005535656 nova_compute[187219]: 2025-11-25 19:22:34.466 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:35 np0005535656 podman[197580]: time="2025-11-25T19:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:22:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Nov 25 14:22:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3066 "" "Go-http-client/1.1"
Nov 25 14:22:36 np0005535656 nova_compute[187219]: 2025-11-25 19:22:36.008 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:36 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:36.109 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:22:37 np0005535656 nova_compute[187219]: 2025-11-25 19:22:37.347 187223 DEBUG nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Check if temp file /var/lib/nova/instances/tmpt1on828x exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 25 14:22:37 np0005535656 nova_compute[187219]: 2025-11-25 19:22:37.348 187223 DEBUG nova.compute.manager [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpt1on828x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='619a8427-fc9c-471f-9812-a27c213f4523',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 25 14:22:38 np0005535656 nova_compute[187219]: 2025-11-25 19:22:38.752 187223 DEBUG oslo_concurrency.processutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:22:38 np0005535656 nova_compute[187219]: 2025-11-25 19:22:38.845 187223 DEBUG oslo_concurrency.processutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:22:38 np0005535656 nova_compute[187219]: 2025-11-25 19:22:38.846 187223 DEBUG oslo_concurrency.processutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:22:38 np0005535656 nova_compute[187219]: 2025-11-25 19:22:38.912 187223 DEBUG oslo_concurrency.processutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:22:39 np0005535656 nova_compute[187219]: 2025-11-25 19:22:39.516 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:40 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:40Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:d9:c0 10.100.0.4
Nov 25 14:22:40 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:40Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:d9:c0 10.100.0.4
Nov 25 14:22:41 np0005535656 nova_compute[187219]: 2025-11-25 19:22:41.010 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:42 np0005535656 podman[219394]: 2025-11-25 19:22:42.986312503 +0000 UTC m=+0.090993511 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 25 14:22:43 np0005535656 podman[219393]: 2025-11-25 19:22:43.052570277 +0000 UTC m=+0.161042497 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:22:43 np0005535656 systemd-logind[788]: New session 44 of user nova.
Nov 25 14:22:43 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 14:22:43 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 14:22:43 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 14:22:43 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 14:22:43 np0005535656 systemd[219444]: Queued start job for default target Main User Target.
Nov 25 14:22:43 np0005535656 systemd[219444]: Created slice User Application Slice.
Nov 25 14:22:43 np0005535656 systemd[219444]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:22:43 np0005535656 systemd[219444]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 14:22:43 np0005535656 systemd[219444]: Reached target Paths.
Nov 25 14:22:43 np0005535656 systemd[219444]: Reached target Timers.
Nov 25 14:22:43 np0005535656 systemd[219444]: Starting D-Bus User Message Bus Socket...
Nov 25 14:22:43 np0005535656 systemd[219444]: Starting Create User's Volatile Files and Directories...
Nov 25 14:22:43 np0005535656 systemd[219444]: Listening on D-Bus User Message Bus Socket.
Nov 25 14:22:43 np0005535656 systemd[219444]: Reached target Sockets.
Nov 25 14:22:43 np0005535656 systemd[219444]: Finished Create User's Volatile Files and Directories.
Nov 25 14:22:43 np0005535656 systemd[219444]: Reached target Basic System.
Nov 25 14:22:43 np0005535656 systemd[219444]: Reached target Main User Target.
Nov 25 14:22:43 np0005535656 systemd[219444]: Startup finished in 165ms.
Nov 25 14:22:43 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 14:22:43 np0005535656 systemd[1]: Started Session 44 of User nova.
Nov 25 14:22:43 np0005535656 systemd[1]: session-44.scope: Deactivated successfully.
Nov 25 14:22:43 np0005535656 systemd-logind[788]: Session 44 logged out. Waiting for processes to exit.
Nov 25 14:22:43 np0005535656 systemd-logind[788]: Removed session 44.
Nov 25 14:22:44 np0005535656 nova_compute[187219]: 2025-11-25 19:22:44.519 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:44 np0005535656 nova_compute[187219]: 2025-11-25 19:22:44.876 187223 DEBUG nova.compute.manager [req-42e9222b-144d-47b3-ba61-020875ac4094 req-7cb0578f-338d-4087-973b-b8f8409a29e3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-unplugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:44 np0005535656 nova_compute[187219]: 2025-11-25 19:22:44.877 187223 DEBUG oslo_concurrency.lockutils [req-42e9222b-144d-47b3-ba61-020875ac4094 req-7cb0578f-338d-4087-973b-b8f8409a29e3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:44 np0005535656 nova_compute[187219]: 2025-11-25 19:22:44.878 187223 DEBUG oslo_concurrency.lockutils [req-42e9222b-144d-47b3-ba61-020875ac4094 req-7cb0578f-338d-4087-973b-b8f8409a29e3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:44 np0005535656 nova_compute[187219]: 2025-11-25 19:22:44.878 187223 DEBUG oslo_concurrency.lockutils [req-42e9222b-144d-47b3-ba61-020875ac4094 req-7cb0578f-338d-4087-973b-b8f8409a29e3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:44 np0005535656 nova_compute[187219]: 2025-11-25 19:22:44.879 187223 DEBUG nova.compute.manager [req-42e9222b-144d-47b3-ba61-020875ac4094 req-7cb0578f-338d-4087-973b-b8f8409a29e3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] No waiting events found dispatching network-vif-unplugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:22:44 np0005535656 nova_compute[187219]: 2025-11-25 19:22:44.879 187223 DEBUG nova.compute.manager [req-42e9222b-144d-47b3-ba61-020875ac4094 req-7cb0578f-338d-4087-973b-b8f8409a29e3 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-unplugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.013 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.033 187223 INFO nova.compute.manager [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Took 7.12 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.034 187223 DEBUG nova.compute.manager [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.063 187223 DEBUG nova.compute.manager [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpt1on828x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='619a8427-fc9c-471f-9812-a27c213f4523',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d7f169ab-508d-430b-92ac-2fa22c2d0b90),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.104 187223 DEBUG nova.objects.instance [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lazy-loading 'migration_context' on Instance uuid 619a8427-fc9c-471f-9812-a27c213f4523 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.106 187223 DEBUG nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.109 187223 DEBUG nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.110 187223 DEBUG nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.132 187223 DEBUG nova.virt.libvirt.vif [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:22:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-646499507',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-646499507',id=28,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:22:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c1fe2249527b46ceb38fa77fbc5aff54',ramdisk_id='',reservation_id='r-mo3m0lqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1652032292',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1652032292-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:22:28Z,user_data=None,user_id='8823b60a0bdf456498433218e470fb7f',uuid=619a8427-fc9c-471f-9812-a27c213f4523,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.133 187223 DEBUG nova.network.os_vif_util [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Converting VIF {"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.135 187223 DEBUG nova.network.os_vif_util [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:d9:c0,bridge_name='br-int',has_traffic_filtering=True,id=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c,network=Network(2ec4a058-7af2-425a-bcc3-2491cdb7cf97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30daa0b3-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.135 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 14:22:46 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:7c:d9:c0"/>
Nov 25 14:22:46 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 14:22:46 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:22:46 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 14:22:46 np0005535656 nova_compute[187219]:  <target dev="tap30daa0b3-22"/>
Nov 25 14:22:46 np0005535656 nova_compute[187219]: </interface>
Nov 25 14:22:46 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.137 187223 DEBUG nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.614 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.615 187223 INFO nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 14:22:46 np0005535656 nova_compute[187219]: 2025-11-25 19:22:46.714 187223 INFO nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.013 187223 DEBUG nova.compute.manager [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.013 187223 DEBUG oslo_concurrency.lockutils [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.015 187223 DEBUG oslo_concurrency.lockutils [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.015 187223 DEBUG oslo_concurrency.lockutils [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.016 187223 DEBUG nova.compute.manager [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] No waiting events found dispatching network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.016 187223 WARNING nova.compute.manager [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received unexpected event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.016 187223 DEBUG nova.compute.manager [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-changed-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.017 187223 DEBUG nova.compute.manager [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Refreshing instance network info cache due to event network-changed-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.017 187223 DEBUG oslo_concurrency.lockutils [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-619a8427-fc9c-471f-9812-a27c213f4523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.017 187223 DEBUG oslo_concurrency.lockutils [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-619a8427-fc9c-471f-9812-a27c213f4523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.017 187223 DEBUG nova.network.neutron [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Refreshing network info cache for port 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.218 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.219 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.723 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:22:47 np0005535656 nova_compute[187219]: 2025-11-25 19:22:47.723 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:22:48 np0005535656 nova_compute[187219]: 2025-11-25 19:22:48.231 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:22:48 np0005535656 nova_compute[187219]: 2025-11-25 19:22:48.231 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:22:48 np0005535656 nova_compute[187219]: 2025-11-25 19:22:48.734 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:22:48 np0005535656 nova_compute[187219]: 2025-11-25 19:22:48.735 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.140 187223 DEBUG nova.network.neutron [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Updated VIF entry in instance network info cache for port 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.141 187223 DEBUG nova.network.neutron [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Updating instance_info_cache with network_info: [{"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.176 187223 DEBUG oslo_concurrency.lockutils [req-ed071cc1-f898-4634-8330-1443868b2c3f req-18999a20-9795-4731-9c5c-169ec766230c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-619a8427-fc9c-471f-9812-a27c213f4523" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.193 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098569.1931133, 619a8427-fc9c-471f-9812-a27c213f4523 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.194 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.214 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.218 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.238 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.239 187223 DEBUG nova.virt.libvirt.migration [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.239 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 14:22:49 np0005535656 kernel: tap30daa0b3-22 (unregistering): left promiscuous mode
Nov 25 14:22:49 np0005535656 NetworkManager[55548]: <info>  [1764098569.3847] device (tap30daa0b3-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:22:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:22:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:22:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:22:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:22:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.427 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:49 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:49Z|00200|binding|INFO|Releasing lport 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c from this chassis (sb_readonly=0)
Nov 25 14:22:49 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:49Z|00201|binding|INFO|Setting lport 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c down in Southbound
Nov 25 14:22:49 np0005535656 ovn_controller[95460]: 2025-11-25T19:22:49Z|00202|binding|INFO|Removing iface tap30daa0b3-22 ovn-installed in OVS
Nov 25 14:22:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:22:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.437 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:d9:c0 10.100.0.4'], port_security=['fa:16:3e:7c:d9:c0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '619a8427-fc9c-471f-9812-a27c213f4523', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ec4a058-7af2-425a-bcc3-2491cdb7cf97', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1fe2249527b46ceb38fa77fbc5aff54', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5905b5c6-8ab2-402f-a4ea-138384a0bcbf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af72c9a8-4619-4c6b-abfa-bcfd9a88042b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.439 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c in datapath 2ec4a058-7af2-425a-bcc3-2491cdb7cf97 unbound from our chassis#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.441 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2ec4a058-7af2-425a-bcc3-2491cdb7cf97, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.445 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e9caad36-2d6c-47eb-947c-0de288cb9fb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.446 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97 namespace which is not needed anymore#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.461 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:49 np0005535656 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Nov 25 14:22:49 np0005535656 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001c.scope: Consumed 15.252s CPU time.
Nov 25 14:22:49 np0005535656 systemd-machined[153481]: Machine qemu-18-instance-0000001c terminated.
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.521 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:49 np0005535656 podman[219466]: 2025-11-25 19:22:49.55418222 +0000 UTC m=+0.104149565 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350)
Nov 25 14:22:49 np0005535656 neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97[219321]: [NOTICE]   (219326) : haproxy version is 2.8.14-c23fe91
Nov 25 14:22:49 np0005535656 neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97[219321]: [NOTICE]   (219326) : path to executable is /usr/sbin/haproxy
Nov 25 14:22:49 np0005535656 neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97[219321]: [WARNING]  (219326) : Exiting Master process...
Nov 25 14:22:49 np0005535656 neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97[219321]: [WARNING]  (219326) : Exiting Master process...
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.627 187223 DEBUG nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.629 187223 DEBUG nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.629 187223 DEBUG nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 14:22:49 np0005535656 neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97[219321]: [ALERT]    (219326) : Current worker (219328) exited with code 143 (Terminated)
Nov 25 14:22:49 np0005535656 neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97[219321]: [WARNING]  (219326) : All workers exited. Exiting... (0)
Nov 25 14:22:49 np0005535656 systemd[1]: libpod-3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b.scope: Deactivated successfully.
Nov 25 14:22:49 np0005535656 podman[219509]: 2025-11-25 19:22:49.64034553 +0000 UTC m=+0.056193904 container died 3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 25 14:22:49 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b-userdata-shm.mount: Deactivated successfully.
Nov 25 14:22:49 np0005535656 systemd[1]: var-lib-containers-storage-overlay-783896b8f6cfb98978d336a28ea0dc37877858603e2a3e2ccf3d07e85813c9da-merged.mount: Deactivated successfully.
Nov 25 14:22:49 np0005535656 podman[219509]: 2025-11-25 19:22:49.68603254 +0000 UTC m=+0.101880944 container cleanup 3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 25 14:22:49 np0005535656 systemd[1]: libpod-conmon-3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b.scope: Deactivated successfully.
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.741 187223 DEBUG nova.virt.libvirt.guest [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '619a8427-fc9c-471f-9812-a27c213f4523' (instance-0000001c) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.742 187223 INFO nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Migration operation has completed#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.742 187223 INFO nova.compute.manager [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] _post_live_migration() is started..#033[00m
Nov 25 14:22:49 np0005535656 podman[219555]: 2025-11-25 19:22:49.766377293 +0000 UTC m=+0.055323330 container remove 3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.775 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[46c21631-9de6-4bce-a94c-ec77c937c865]: (4, ('Tue Nov 25 07:22:49 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97 (3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b)\n3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b\nTue Nov 25 07:22:49 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97 (3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b)\n3b2863ece537ed089766b9b144eaaa5aa526d688429b908aca9fe312f987102b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.777 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c27f046d-bfdc-45a9-9e86-1f1846a0b889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.777 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ec4a058-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.779 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:49 np0005535656 kernel: tap2ec4a058-70: left promiscuous mode
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.809 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.812 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[390324b5-7218-4c00-b9db-11ba803d03dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.828 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e193fb2f-2c74-41fa-94c4-2da5ac8aa8c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.830 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[519683c7-3d6c-48cf-be17-1c339c1497a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.858 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc3cdd5-0338-4392-a713-9b696dab320d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550875, 'reachable_time': 18640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219573, 'error': None, 'target': 'ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:49 np0005535656 systemd[1]: run-netns-ovnmeta\x2d2ec4a058\x2d7af2\x2d425a\x2dbcc3\x2d2491cdb7cf97.mount: Deactivated successfully.
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.862 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2ec4a058-7af2-425a-bcc3-2491cdb7cf97 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:22:49 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:49.862 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[3d42f65e-fafa-4487-b819-4c73f8e65b5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.971 187223 DEBUG nova.compute.manager [req-63009a4b-594c-49b0-81bd-e410ffbf2e7c req-eb7479be-1d7a-4701-9b26-b38502b0080c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-unplugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.971 187223 DEBUG oslo_concurrency.lockutils [req-63009a4b-594c-49b0-81bd-e410ffbf2e7c req-eb7479be-1d7a-4701-9b26-b38502b0080c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.972 187223 DEBUG oslo_concurrency.lockutils [req-63009a4b-594c-49b0-81bd-e410ffbf2e7c req-eb7479be-1d7a-4701-9b26-b38502b0080c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.973 187223 DEBUG oslo_concurrency.lockutils [req-63009a4b-594c-49b0-81bd-e410ffbf2e7c req-eb7479be-1d7a-4701-9b26-b38502b0080c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.973 187223 DEBUG nova.compute.manager [req-63009a4b-594c-49b0-81bd-e410ffbf2e7c req-eb7479be-1d7a-4701-9b26-b38502b0080c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] No waiting events found dispatching network-vif-unplugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:22:49 np0005535656 nova_compute[187219]: 2025-11-25 19:22:49.973 187223 DEBUG nova.compute.manager [req-63009a4b-594c-49b0-81bd-e410ffbf2e7c req-eb7479be-1d7a-4701-9b26-b38502b0080c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-unplugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:22:50 np0005535656 nova_compute[187219]: 2025-11-25 19:22:50.474 187223 DEBUG nova.compute.manager [req-824d2390-8afb-4523-b119-6020fbb23f54 req-765a7e77-6281-43c2-88ba-2281793b885c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-unplugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:50 np0005535656 nova_compute[187219]: 2025-11-25 19:22:50.475 187223 DEBUG oslo_concurrency.lockutils [req-824d2390-8afb-4523-b119-6020fbb23f54 req-765a7e77-6281-43c2-88ba-2281793b885c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:50 np0005535656 nova_compute[187219]: 2025-11-25 19:22:50.475 187223 DEBUG oslo_concurrency.lockutils [req-824d2390-8afb-4523-b119-6020fbb23f54 req-765a7e77-6281-43c2-88ba-2281793b885c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:50 np0005535656 nova_compute[187219]: 2025-11-25 19:22:50.476 187223 DEBUG oslo_concurrency.lockutils [req-824d2390-8afb-4523-b119-6020fbb23f54 req-765a7e77-6281-43c2-88ba-2281793b885c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:50 np0005535656 nova_compute[187219]: 2025-11-25 19:22:50.476 187223 DEBUG nova.compute.manager [req-824d2390-8afb-4523-b119-6020fbb23f54 req-765a7e77-6281-43c2-88ba-2281793b885c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] No waiting events found dispatching network-vif-unplugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:22:50 np0005535656 nova_compute[187219]: 2025-11-25 19:22:50.476 187223 DEBUG nova.compute.manager [req-824d2390-8afb-4523-b119-6020fbb23f54 req-765a7e77-6281-43c2-88ba-2281793b885c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-unplugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.017 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.190 187223 DEBUG nova.network.neutron [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Activated binding for port 30daa0b3-22f3-4c1a-9e53-90d60a7ac48c and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.191 187223 DEBUG nova.compute.manager [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.192 187223 DEBUG nova.virt.libvirt.vif [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:22:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-646499507',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-646499507',id=28,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:22:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c1fe2249527b46ceb38fa77fbc5aff54',ramdisk_id='',reservation_id='r-mo3m0lqd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1652032292',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1652032292-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:22:34Z,user_data=None,user_id='8823b60a0bdf456498433218e470fb7f',uuid=619a8427-fc9c-471f-9812-a27c213f4523,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.192 187223 DEBUG nova.network.os_vif_util [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Converting VIF {"id": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "address": "fa:16:3e:7c:d9:c0", "network": {"id": "2ec4a058-7af2-425a-bcc3-2491cdb7cf97", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1430632150-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c1fe2249527b46ceb38fa77fbc5aff54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30daa0b3-22", "ovs_interfaceid": "30daa0b3-22f3-4c1a-9e53-90d60a7ac48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.193 187223 DEBUG nova.network.os_vif_util [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:d9:c0,bridge_name='br-int',has_traffic_filtering=True,id=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c,network=Network(2ec4a058-7af2-425a-bcc3-2491cdb7cf97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30daa0b3-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.194 187223 DEBUG os_vif [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:d9:c0,bridge_name='br-int',has_traffic_filtering=True,id=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c,network=Network(2ec4a058-7af2-425a-bcc3-2491cdb7cf97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30daa0b3-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.196 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.196 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30daa0b3-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.198 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.200 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.204 187223 INFO os_vif [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:d9:c0,bridge_name='br-int',has_traffic_filtering=True,id=30daa0b3-22f3-4c1a-9e53-90d60a7ac48c,network=Network(2ec4a058-7af2-425a-bcc3-2491cdb7cf97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30daa0b3-22')#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.204 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.205 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.205 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.205 187223 DEBUG nova.compute.manager [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.206 187223 INFO nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Deleting instance files /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523_del#033[00m
Nov 25 14:22:51 np0005535656 nova_compute[187219]: 2025-11-25 19:22:51.208 187223 INFO nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Deletion of /var/lib/nova/instances/619a8427-fc9c-471f-9812-a27c213f4523_del complete#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.112 187223 DEBUG nova.compute.manager [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.113 187223 DEBUG oslo_concurrency.lockutils [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.113 187223 DEBUG oslo_concurrency.lockutils [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.114 187223 DEBUG oslo_concurrency.lockutils [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.114 187223 DEBUG nova.compute.manager [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] No waiting events found dispatching network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.115 187223 WARNING nova.compute.manager [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received unexpected event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.115 187223 DEBUG nova.compute.manager [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.115 187223 DEBUG oslo_concurrency.lockutils [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.116 187223 DEBUG oslo_concurrency.lockutils [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.116 187223 DEBUG oslo_concurrency.lockutils [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.117 187223 DEBUG nova.compute.manager [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] No waiting events found dispatching network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.117 187223 WARNING nova.compute.manager [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received unexpected event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.117 187223 DEBUG nova.compute.manager [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.118 187223 DEBUG oslo_concurrency.lockutils [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.118 187223 DEBUG oslo_concurrency.lockutils [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.119 187223 DEBUG oslo_concurrency.lockutils [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.119 187223 DEBUG nova.compute.manager [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] No waiting events found dispatching network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.119 187223 WARNING nova.compute.manager [req-12159956-0317-4387-bde3-20def4c0220e req-5c76d9cb-315a-4010-a174-442f6d6fefa9 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received unexpected event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.574 187223 DEBUG nova.compute.manager [req-071ef030-f4d1-42b7-8cbd-d53c5ba39d7b req-86d5f079-ea6b-4401-a7a2-e6ee7958e508 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.575 187223 DEBUG oslo_concurrency.lockutils [req-071ef030-f4d1-42b7-8cbd-d53c5ba39d7b req-86d5f079-ea6b-4401-a7a2-e6ee7958e508 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.575 187223 DEBUG oslo_concurrency.lockutils [req-071ef030-f4d1-42b7-8cbd-d53c5ba39d7b req-86d5f079-ea6b-4401-a7a2-e6ee7958e508 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.576 187223 DEBUG oslo_concurrency.lockutils [req-071ef030-f4d1-42b7-8cbd-d53c5ba39d7b req-86d5f079-ea6b-4401-a7a2-e6ee7958e508 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.576 187223 DEBUG nova.compute.manager [req-071ef030-f4d1-42b7-8cbd-d53c5ba39d7b req-86d5f079-ea6b-4401-a7a2-e6ee7958e508 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] No waiting events found dispatching network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:22:52 np0005535656 nova_compute[187219]: 2025-11-25 19:22:52.576 187223 WARNING nova.compute.manager [req-071ef030-f4d1-42b7-8cbd-d53c5ba39d7b req-86d5f079-ea6b-4401-a7a2-e6ee7958e508 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Received unexpected event network-vif-plugged-30daa0b3-22f3-4c1a-9e53-90d60a7ac48c for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:22:53 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 14:22:53 np0005535656 systemd[219444]: Activating special unit Exit the Session...
Nov 25 14:22:53 np0005535656 systemd[219444]: Stopped target Main User Target.
Nov 25 14:22:53 np0005535656 systemd[219444]: Stopped target Basic System.
Nov 25 14:22:53 np0005535656 systemd[219444]: Stopped target Paths.
Nov 25 14:22:53 np0005535656 systemd[219444]: Stopped target Sockets.
Nov 25 14:22:53 np0005535656 systemd[219444]: Stopped target Timers.
Nov 25 14:22:53 np0005535656 systemd[219444]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:22:53 np0005535656 systemd[219444]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 14:22:53 np0005535656 systemd[219444]: Closed D-Bus User Message Bus Socket.
Nov 25 14:22:53 np0005535656 systemd[219444]: Stopped Create User's Volatile Files and Directories.
Nov 25 14:22:53 np0005535656 systemd[219444]: Removed slice User Application Slice.
Nov 25 14:22:53 np0005535656 systemd[219444]: Reached target Shutdown.
Nov 25 14:22:53 np0005535656 systemd[219444]: Finished Exit the Session.
Nov 25 14:22:53 np0005535656 systemd[219444]: Reached target Exit the Session.
Nov 25 14:22:53 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 14:22:53 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 14:22:53 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 14:22:53 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 14:22:53 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 14:22:53 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 14:22:53 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 14:22:53 np0005535656 podman[219575]: 2025-11-25 19:22:53.994378201 +0000 UTC m=+0.092003498 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:22:54 np0005535656 nova_compute[187219]: 2025-11-25 19:22:54.524 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:55 np0005535656 nova_compute[187219]: 2025-11-25 19:22:55.897 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Acquiring lock "619a8427-fc9c-471f-9812-a27c213f4523-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:55 np0005535656 nova_compute[187219]: 2025-11-25 19:22:55.898 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:55 np0005535656 nova_compute[187219]: 2025-11-25 19:22:55.898 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "619a8427-fc9c-471f-9812-a27c213f4523-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:55 np0005535656 nova_compute[187219]: 2025-11-25 19:22:55.931 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:55 np0005535656 nova_compute[187219]: 2025-11-25 19:22:55.932 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:55 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 14:22:55 np0005535656 nova_compute[187219]: 2025-11-25 19:22:55.932 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:55 np0005535656 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 14:22:55 np0005535656 nova_compute[187219]: 2025-11-25 19:22:55.933 187223 DEBUG nova.compute.resource_tracker [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.176 187223 WARNING nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.178 187223 DEBUG nova.compute.resource_tracker [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5862MB free_disk=73.16272354125977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.178 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.179 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.200 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.328 187223 DEBUG nova.compute.resource_tracker [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Migration for instance 619a8427-fc9c-471f-9812-a27c213f4523 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.346 187223 DEBUG nova.compute.resource_tracker [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.380 187223 DEBUG nova.compute.resource_tracker [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Migration d7f169ab-508d-430b-92ac-2fa22c2d0b90 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.381 187223 DEBUG nova.compute.resource_tracker [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.381 187223 DEBUG nova.compute.resource_tracker [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.418 187223 DEBUG nova.compute.provider_tree [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.434 187223 DEBUG nova.scheduler.client.report [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.462 187223 DEBUG nova.compute.resource_tracker [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.463 187223 DEBUG oslo_concurrency.lockutils [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.471 187223 INFO nova.compute.manager [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.599 187223 INFO nova.scheduler.client.report [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] Deleted allocation for migration d7f169ab-508d-430b-92ac-2fa22c2d0b90#033[00m
Nov 25 14:22:56 np0005535656 nova_compute[187219]: 2025-11-25 19:22:56.600 187223 DEBUG nova.virt.libvirt.driver [None req-b79d0af4-6d97-4b63-a7e7-b1b6f1a9adf5 352890ed10204ab38fbf1580c5923e9d 33f53d124a7c4ae592fb023bc424705e - - default default] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 25 14:22:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:59.100 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:22:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:59.101 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:22:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:22:59.101 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:22:59 np0005535656 nova_compute[187219]: 2025-11-25 19:22:59.551 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:01 np0005535656 nova_compute[187219]: 2025-11-25 19:23:01.202 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:03 np0005535656 podman[219600]: 2025-11-25 19:23:03.959037347 +0000 UTC m=+0.067674163 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:23:04 np0005535656 nova_compute[187219]: 2025-11-25 19:23:04.591 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:04 np0005535656 nova_compute[187219]: 2025-11-25 19:23:04.626 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764098569.6247964, 619a8427-fc9c-471f-9812-a27c213f4523 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:23:04 np0005535656 nova_compute[187219]: 2025-11-25 19:23:04.626 187223 INFO nova.compute.manager [-] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:23:04 np0005535656 nova_compute[187219]: 2025-11-25 19:23:04.654 187223 DEBUG nova.compute.manager [None req-70c538db-4e25-4adb-a34a-4aa89b30d18c - - - - - -] [instance: 619a8427-fc9c-471f-9812-a27c213f4523] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:23:05 np0005535656 podman[197580]: time="2025-11-25T19:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:23:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:23:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Nov 25 14:23:06 np0005535656 nova_compute[187219]: 2025-11-25 19:23:06.205 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:09 np0005535656 nova_compute[187219]: 2025-11-25 19:23:09.596 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:11 np0005535656 nova_compute[187219]: 2025-11-25 19:23:11.208 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:12 np0005535656 nova_compute[187219]: 2025-11-25 19:23:12.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:12 np0005535656 nova_compute[187219]: 2025-11-25 19:23:12.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:23:12 np0005535656 nova_compute[187219]: 2025-11-25 19:23:12.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:23:12 np0005535656 nova_compute[187219]: 2025-11-25 19:23:12.698 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:23:13 np0005535656 nova_compute[187219]: 2025-11-25 19:23:13.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:13 np0005535656 podman[219625]: 2025-11-25 19:23:13.953714973 +0000 UTC m=+0.066543252 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 25 14:23:14 np0005535656 podman[219624]: 2025-11-25 19:23:14.022066314 +0000 UTC m=+0.131671386 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true)
Nov 25 14:23:14 np0005535656 nova_compute[187219]: 2025-11-25 19:23:14.641 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:14 np0005535656 nova_compute[187219]: 2025-11-25 19:23:14.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:15 np0005535656 nova_compute[187219]: 2025-11-25 19:23:15.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:16 np0005535656 nova_compute[187219]: 2025-11-25 19:23:16.211 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:18 np0005535656 nova_compute[187219]: 2025-11-25 19:23:18.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:18 np0005535656 nova_compute[187219]: 2025-11-25 19:23:18.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:23:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:23:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:23:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:23:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:23:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:23:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:23:19 np0005535656 nova_compute[187219]: 2025-11-25 19:23:19.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:19 np0005535656 nova_compute[187219]: 2025-11-25 19:23:19.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:23:19 np0005535656 nova_compute[187219]: 2025-11-25 19:23:19.677 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:19 np0005535656 podman[219670]: 2025-11-25 19:23:19.971709346 +0000 UTC m=+0.088352490 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, 
io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6)
Nov 25 14:23:21 np0005535656 nova_compute[187219]: 2025-11-25 19:23:21.215 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:21 np0005535656 nova_compute[187219]: 2025-11-25 19:23:21.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.702 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.703 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.704 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.704 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.909 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.910 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5873MB free_disk=73.15946960449219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.911 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.911 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.996 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:23:22 np0005535656 nova_compute[187219]: 2025-11-25 19:23:22.997 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:23:23 np0005535656 nova_compute[187219]: 2025-11-25 19:23:23.017 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing inventories for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 25 14:23:23 np0005535656 nova_compute[187219]: 2025-11-25 19:23:23.044 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating ProviderTree inventory for provider 752b63a7-2ce2-4d83-a281-12c9803714ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 25 14:23:23 np0005535656 nova_compute[187219]: 2025-11-25 19:23:23.045 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Updating inventory in ProviderTree for provider 752b63a7-2ce2-4d83-a281-12c9803714ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 25 14:23:23 np0005535656 nova_compute[187219]: 2025-11-25 19:23:23.066 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing aggregate associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 25 14:23:23 np0005535656 nova_compute[187219]: 2025-11-25 19:23:23.095 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Refreshing trait associations for resource provider 752b63a7-2ce2-4d83-a281-12c9803714ea, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 25 14:23:23 np0005535656 nova_compute[187219]: 2025-11-25 19:23:23.122 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:23:23 np0005535656 nova_compute[187219]: 2025-11-25 19:23:23.141 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:23:23 np0005535656 nova_compute[187219]: 2025-11-25 19:23:23.145 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:23:23 np0005535656 nova_compute[187219]: 2025-11-25 19:23:23.146 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:23:24 np0005535656 nova_compute[187219]: 2025-11-25 19:23:24.148 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:23:24 np0005535656 nova_compute[187219]: 2025-11-25 19:23:24.683 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:25 np0005535656 podman[219691]: 2025-11-25 19:23:25.008912182 +0000 UTC m=+0.122851819 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:23:26 np0005535656 nova_compute[187219]: 2025-11-25 19:23:26.219 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:29 np0005535656 nova_compute[187219]: 2025-11-25 19:23:29.723 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:31 np0005535656 nova_compute[187219]: 2025-11-25 19:23:31.221 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:34 np0005535656 nova_compute[187219]: 2025-11-25 19:23:34.776 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:34 np0005535656 podman[219712]: 2025-11-25 19:23:34.969364926 +0000 UTC m=+0.083339125 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 14:23:35 np0005535656 podman[197580]: time="2025-11-25T19:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:23:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:23:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 25 14:23:36 np0005535656 nova_compute[187219]: 2025-11-25 19:23:36.224 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:39 np0005535656 nova_compute[187219]: 2025-11-25 19:23:39.778 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:41 np0005535656 nova_compute[187219]: 2025-11-25 19:23:41.227 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:44 np0005535656 nova_compute[187219]: 2025-11-25 19:23:44.779 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:44 np0005535656 podman[219738]: 2025-11-25 19:23:44.912823502 +0000 UTC m=+0.086536201 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 25 14:23:44 np0005535656 podman[219737]: 2025-11-25 19:23:44.947060243 +0000 UTC m=+0.125976913 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 25 14:23:46 np0005535656 nova_compute[187219]: 2025-11-25 19:23:46.229 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:23:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:23:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:23:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:23:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:23:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:23:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:23:49 np0005535656 nova_compute[187219]: 2025-11-25 19:23:49.823 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:50 np0005535656 podman[219781]: 2025-11-25 19:23:50.985911468 +0000 UTC m=+0.103422406 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, version=9.6)
Nov 25 14:23:51 np0005535656 ovn_controller[95460]: 2025-11-25T19:23:51Z|00203|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 25 14:23:51 np0005535656 nova_compute[187219]: 2025-11-25 19:23:51.231 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:54 np0005535656 nova_compute[187219]: 2025-11-25 19:23:54.887 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:55 np0005535656 podman[219802]: 2025-11-25 19:23:55.974560406 +0000 UTC m=+0.081673100 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 25 14:23:56 np0005535656 nova_compute[187219]: 2025-11-25 19:23:56.234 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:23:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:23:59.102 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:23:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:23:59.102 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:23:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:23:59.102 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:23:59 np0005535656 nova_compute[187219]: 2025-11-25 19:23:59.946 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:01 np0005535656 nova_compute[187219]: 2025-11-25 19:24:01.237 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:04 np0005535656 nova_compute[187219]: 2025-11-25 19:24:04.979 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:05 np0005535656 podman[197580]: time="2025-11-25T19:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:24:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:24:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Nov 25 14:24:05 np0005535656 podman[219823]: 2025-11-25 19:24:05.94325098 +0000 UTC m=+0.068250849 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:24:06 np0005535656 nova_compute[187219]: 2025-11-25 19:24:06.239 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:10 np0005535656 nova_compute[187219]: 2025-11-25 19:24:10.028 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:11.127 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:24:11 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:11.128 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:24:11 np0005535656 nova_compute[187219]: 2025-11-25 19:24:11.173 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:11 np0005535656 nova_compute[187219]: 2025-11-25 19:24:11.241 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:13 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:13.129 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:24:13 np0005535656 nova_compute[187219]: 2025-11-25 19:24:13.219 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:13 np0005535656 nova_compute[187219]: 2025-11-25 19:24:13.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:14 np0005535656 nova_compute[187219]: 2025-11-25 19:24:14.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:14 np0005535656 nova_compute[187219]: 2025-11-25 19:24:14.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:24:14 np0005535656 nova_compute[187219]: 2025-11-25 19:24:14.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:24:14 np0005535656 nova_compute[187219]: 2025-11-25 19:24:14.692 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:24:14 np0005535656 nova_compute[187219]: 2025-11-25 19:24:14.693 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:15 np0005535656 nova_compute[187219]: 2025-11-25 19:24:15.031 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:15 np0005535656 nova_compute[187219]: 2025-11-25 19:24:15.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:15 np0005535656 podman[219850]: 2025-11-25 19:24:15.950199615 +0000 UTC m=+0.064288522 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 14:24:15 np0005535656 podman[219849]: 2025-11-25 19:24:15.960227765 +0000 UTC m=+0.081407703 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 25 14:24:16 np0005535656 nova_compute[187219]: 2025-11-25 19:24:16.242 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:24:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:24:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:24:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:24:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:24:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:24:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:24:19 np0005535656 nova_compute[187219]: 2025-11-25 19:24:19.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:19 np0005535656 nova_compute[187219]: 2025-11-25 19:24:19.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:24:20 np0005535656 nova_compute[187219]: 2025-11-25 19:24:20.063 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:20 np0005535656 nova_compute[187219]: 2025-11-25 19:24:20.668 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:20 np0005535656 nova_compute[187219]: 2025-11-25 19:24:20.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:21 np0005535656 nova_compute[187219]: 2025-11-25 19:24:21.273 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:21 np0005535656 podman[219896]: 2025-11-25 19:24:21.961413284 +0000 UTC m=+0.077637931 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc.)
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.733 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.733 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.733 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.734 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.937 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.938 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5887MB free_disk=73.15946960449219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.938 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:24 np0005535656 nova_compute[187219]: 2025-11-25 19:24:24.939 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:25 np0005535656 nova_compute[187219]: 2025-11-25 19:24:25.049 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:24:25 np0005535656 nova_compute[187219]: 2025-11-25 19:24:25.050 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:24:25 np0005535656 nova_compute[187219]: 2025-11-25 19:24:25.103 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:25 np0005535656 nova_compute[187219]: 2025-11-25 19:24:25.107 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:24:25 np0005535656 nova_compute[187219]: 2025-11-25 19:24:25.135 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:24:25 np0005535656 nova_compute[187219]: 2025-11-25 19:24:25.137 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:24:25 np0005535656 nova_compute[187219]: 2025-11-25 19:24:25.138 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:26 np0005535656 nova_compute[187219]: 2025-11-25 19:24:26.276 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:26 np0005535656 podman[219917]: 2025-11-25 19:24:26.974953604 +0000 UTC m=+0.084093996 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Nov 25 14:24:30 np0005535656 nova_compute[187219]: 2025-11-25 19:24:30.130 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:31 np0005535656 nova_compute[187219]: 2025-11-25 19:24:31.317 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:35 np0005535656 nova_compute[187219]: 2025-11-25 19:24:35.188 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:35 np0005535656 podman[197580]: time="2025-11-25T19:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:24:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:24:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Nov 25 14:24:36 np0005535656 nova_compute[187219]: 2025-11-25 19:24:36.347 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:36 np0005535656 podman[219939]: 2025-11-25 19:24:36.965479356 +0000 UTC m=+0.080055566 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.230 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.673 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.674 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.675 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.675 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.676 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.676 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.708 187223 DEBUG nova.virt.libvirt.imagecache [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.709 187223 WARNING nova.virt.libvirt.imagecache [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.709 187223 INFO nova.virt.libvirt.imagecache [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Removable base files: /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.710 187223 INFO nova.virt.libvirt.imagecache [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.710 187223 DEBUG nova.virt.libvirt.imagecache [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.710 187223 DEBUG nova.virt.libvirt.imagecache [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Nov 25 14:24:40 np0005535656 nova_compute[187219]: 2025-11-25 19:24:40.710 187223 DEBUG nova.virt.libvirt.imagecache [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Nov 25 14:24:41 np0005535656 nova_compute[187219]: 2025-11-25 19:24:41.350 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:45 np0005535656 nova_compute[187219]: 2025-11-25 19:24:45.232 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:46 np0005535656 nova_compute[187219]: 2025-11-25 19:24:46.353 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:46 np0005535656 ovn_controller[95460]: 2025-11-25T19:24:46Z|00204|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 14:24:46 np0005535656 podman[219965]: 2025-11-25 19:24:46.965884394 +0000 UTC m=+0.076166613 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 25 14:24:47 np0005535656 podman[219964]: 2025-11-25 19:24:47.01662669 +0000 UTC m=+0.131291957 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 25 14:24:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:24:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:24:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:24:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:24:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:24:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:24:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:24:50 np0005535656 nova_compute[187219]: 2025-11-25 19:24:50.236 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:51 np0005535656 nova_compute[187219]: 2025-11-25 19:24:51.360 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.233 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.233 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.251 187223 DEBUG nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.341 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.342 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.351 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.352 187223 INFO nova.compute.claims [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.476 187223 DEBUG nova.compute.provider_tree [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.590 187223 DEBUG nova.scheduler.client.report [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.636 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.637 187223 DEBUG nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.691 187223 DEBUG nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.692 187223 DEBUG nova.network.neutron [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.716 187223 INFO nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.738 187223 DEBUG nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.823 187223 DEBUG nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.825 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.826 187223 INFO nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Creating image(s)#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.827 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Acquiring lock "/var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.827 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "/var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.828 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "/var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.856 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.950 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.951 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Acquiring lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.952 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:52 np0005535656 nova_compute[187219]: 2025-11-25 19:24:52.977 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:24:52 np0005535656 podman[220010]: 2025-11-25 19:24:52.982812558 +0000 UTC m=+0.090659052 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.029 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.030 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.075 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473,backing_fmt=raw /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.076 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "f9afa418d80acffaa10fd2196926f68dcc1e2473" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.076 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.126 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f9afa418d80acffaa10fd2196926f68dcc1e2473 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.127 187223 DEBUG nova.virt.disk.api [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Checking if we can resize image /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.128 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.177 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.179 187223 DEBUG nova.virt.disk.api [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Cannot resize image /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.179 187223 DEBUG nova.objects.instance [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lazy-loading 'migration_context' on Instance uuid 71c4cd6e-6474-4a98-91a6-f61169eb7b8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.200 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.201 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Ensure instance console log exists: /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.201 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.202 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.202 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:53 np0005535656 nova_compute[187219]: 2025-11-25 19:24:53.778 187223 DEBUG nova.policy [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a222e3c58fcf4706ab56cbca2847c233', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3476b4f1173485eb848b834c8dd8cf9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 25 14:24:55 np0005535656 nova_compute[187219]: 2025-11-25 19:24:55.239 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:55 np0005535656 nova_compute[187219]: 2025-11-25 19:24:55.363 187223 DEBUG nova.network.neutron [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Successfully created port: 070356c2-cd5d-4830-ae1b-b6767843f668 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 25 14:24:56 np0005535656 nova_compute[187219]: 2025-11-25 19:24:56.032 187223 DEBUG nova.network.neutron [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Successfully updated port: 070356c2-cd5d-4830-ae1b-b6767843f668 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 25 14:24:56 np0005535656 nova_compute[187219]: 2025-11-25 19:24:56.062 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Acquiring lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:24:56 np0005535656 nova_compute[187219]: 2025-11-25 19:24:56.062 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Acquired lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:24:56 np0005535656 nova_compute[187219]: 2025-11-25 19:24:56.063 187223 DEBUG nova.network.neutron [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 25 14:24:56 np0005535656 nova_compute[187219]: 2025-11-25 19:24:56.143 187223 DEBUG nova.compute.manager [req-ab965eb0-da4b-4433-bbab-774ffc67cdd1 req-535286c7-fea1-4382-a4ff-be1f36bd8c1c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-changed-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:24:56 np0005535656 nova_compute[187219]: 2025-11-25 19:24:56.143 187223 DEBUG nova.compute.manager [req-ab965eb0-da4b-4433-bbab-774ffc67cdd1 req-535286c7-fea1-4382-a4ff-be1f36bd8c1c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Refreshing instance network info cache due to event network-changed-070356c2-cd5d-4830-ae1b-b6767843f668. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:24:56 np0005535656 nova_compute[187219]: 2025-11-25 19:24:56.144 187223 DEBUG oslo_concurrency.lockutils [req-ab965eb0-da4b-4433-bbab-774ffc67cdd1 req-535286c7-fea1-4382-a4ff-be1f36bd8c1c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:24:56 np0005535656 nova_compute[187219]: 2025-11-25 19:24:56.362 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:56 np0005535656 nova_compute[187219]: 2025-11-25 19:24:56.684 187223 DEBUG nova.network.neutron [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.774 187223 DEBUG nova.network.neutron [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Updating instance_info_cache with network_info: [{"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.806 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Releasing lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.807 187223 DEBUG nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Instance network_info: |[{"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.808 187223 DEBUG oslo_concurrency.lockutils [req-ab965eb0-da4b-4433-bbab-774ffc67cdd1 req-535286c7-fea1-4382-a4ff-be1f36bd8c1c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.808 187223 DEBUG nova.network.neutron [req-ab965eb0-da4b-4433-bbab-774ffc67cdd1 req-535286c7-fea1-4382-a4ff-be1f36bd8c1c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Refreshing network info cache for port 070356c2-cd5d-4830-ae1b-b6767843f668 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.814 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Start _get_guest_xml network_info=[{"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'image_id': '1ea5e141-b92c-44f3-97b7-7b313587d3bf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.819 187223 WARNING nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.825 187223 DEBUG nova.virt.libvirt.host [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.826 187223 DEBUG nova.virt.libvirt.host [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.835 187223 DEBUG nova.virt.libvirt.host [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.836 187223 DEBUG nova.virt.libvirt.host [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.837 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.838 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T18:49:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a7ebe884-489b-45b6-89a1-4967aa291cd6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T18:49:28Z,direct_url=<?>,disk_format='qcow2',id=1ea5e141-b92c-44f3-97b7-7b313587d3bf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='33f53d124a7c4ae592fb023bc424705e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T18:49:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.839 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.839 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.840 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.840 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.840 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.841 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.842 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.842 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.842 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.843 187223 DEBUG nova.virt.hardware [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.850 187223 DEBUG nova.virt.libvirt.vif [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:24:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-848740995',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-848740995',id=29,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3476b4f1173485eb848b834c8dd8cf9',ramdisk_id='',reservation_id='r-u668r4aw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1998892509',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-
1998892509-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:24:52Z,user_data=None,user_id='a222e3c58fcf4706ab56cbca2847c233',uuid=71c4cd6e-6474-4a98-91a6-f61169eb7b8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.850 187223 DEBUG nova.network.os_vif_util [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Converting VIF {"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.851 187223 DEBUG nova.network.os_vif_util [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:20:ee,bridge_name='br-int',has_traffic_filtering=True,id=070356c2-cd5d-4830-ae1b-b6767843f668,network=Network(0595ca0a-0994-45d7-b765-c6d94beda8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070356c2-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.853 187223 DEBUG nova.objects.instance [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 71c4cd6e-6474-4a98-91a6-f61169eb7b8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.883 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] End _get_guest_xml xml=<domain type="kvm">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <uuid>71c4cd6e-6474-4a98-91a6-f61169eb7b8f</uuid>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <name>instance-0000001d</name>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <memory>131072</memory>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <vcpu>1</vcpu>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <metadata>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-848740995</nova:name>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <nova:creationTime>2025-11-25 19:24:57</nova:creationTime>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <nova:flavor name="m1.nano">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:        <nova:memory>128</nova:memory>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:        <nova:disk>1</nova:disk>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:        <nova:swap>0</nova:swap>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:        <nova:ephemeral>0</nova:ephemeral>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:        <nova:vcpus>1</nova:vcpus>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      </nova:flavor>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <nova:owner>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:        <nova:user uuid="a222e3c58fcf4706ab56cbca2847c233">tempest-TestExecuteZoneMigrationStrategy-1998892509-project-member</nova:user>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:        <nova:project uuid="d3476b4f1173485eb848b834c8dd8cf9">tempest-TestExecuteZoneMigrationStrategy-1998892509</nova:project>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      </nova:owner>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <nova:root type="image" uuid="1ea5e141-b92c-44f3-97b7-7b313587d3bf"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <nova:ports>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:        <nova:port uuid="070356c2-cd5d-4830-ae1b-b6767843f668">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:        </nova:port>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      </nova:ports>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    </nova:instance>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  </metadata>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <sysinfo type="smbios">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <system>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <entry name="manufacturer">RDO</entry>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <entry name="product">OpenStack Compute</entry>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <entry name="serial">71c4cd6e-6474-4a98-91a6-f61169eb7b8f</entry>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <entry name="uuid">71c4cd6e-6474-4a98-91a6-f61169eb7b8f</entry>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <entry name="family">Virtual Machine</entry>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    </system>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  </sysinfo>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <os>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <boot dev="hd"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <smbios mode="sysinfo"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  </os>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <features>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <acpi/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <apic/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <vmcoreinfo/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  </features>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <clock offset="utc">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <timer name="pit" tickpolicy="delay"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <timer name="hpet" present="no"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  </clock>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <cpu mode="custom" match="exact">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <model>Nehalem</model>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <topology sockets="1" cores="1" threads="1"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  </cpu>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  <devices>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <disk type="file" device="disk">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <target dev="vda" bus="virtio"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <disk type="file" device="cdrom">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <driver name="qemu" type="raw" cache="none"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <source file="/var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk.config"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <target dev="sda" bus="sata"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    </disk>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <interface type="ethernet">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <mac address="fa:16:3e:58:20:ee"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <mtu size="1442"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <target dev="tap070356c2-cd"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    </interface>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <serial type="pty">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <log file="/var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/console.log" append="off"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    </serial>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <video>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <model type="virtio"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    </video>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <input type="tablet" bus="usb"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <rng model="virtio">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <backend model="random">/dev/urandom</backend>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    </rng>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="pci" model="pcie-root-port"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <controller type="usb" index="0"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    <memballoon model="virtio">
Nov 25 14:24:57 np0005535656 nova_compute[187219]:      <stats period="10"/>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:    </memballoon>
Nov 25 14:24:57 np0005535656 nova_compute[187219]:  </devices>
Nov 25 14:24:57 np0005535656 nova_compute[187219]: </domain>
Nov 25 14:24:57 np0005535656 nova_compute[187219]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.884 187223 DEBUG nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Preparing to wait for external event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.884 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.885 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.885 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.886 187223 DEBUG nova.virt.libvirt.vif [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-25T19:24:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-848740995',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-848740995',id=29,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d3476b4f1173485eb848b834c8dd8cf9',ramdisk_id='',reservation_id='r-u668r4aw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1998892509',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1998892509-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:24:52Z,user_data=None,user_id='a222e3c58fcf4706ab56cbca2847c233',uuid=71c4cd6e-6474-4a98-91a6-f61169eb7b8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.886 187223 DEBUG nova.network.os_vif_util [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Converting VIF {"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.887 187223 DEBUG nova.network.os_vif_util [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:20:ee,bridge_name='br-int',has_traffic_filtering=True,id=070356c2-cd5d-4830-ae1b-b6767843f668,network=Network(0595ca0a-0994-45d7-b765-c6d94beda8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070356c2-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.888 187223 DEBUG os_vif [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:20:ee,bridge_name='br-int',has_traffic_filtering=True,id=070356c2-cd5d-4830-ae1b-b6767843f668,network=Network(0595ca0a-0994-45d7-b765-c6d94beda8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070356c2-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.889 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.889 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.890 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.894 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.894 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap070356c2-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.894 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap070356c2-cd, col_values=(('external_ids', {'iface-id': '070356c2-cd5d-4830-ae1b-b6767843f668', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:20:ee', 'vm-uuid': '71c4cd6e-6474-4a98-91a6-f61169eb7b8f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.896 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:57 np0005535656 NetworkManager[55548]: <info>  [1764098697.8978] manager: (tap070356c2-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.898 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.907 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.908 187223 INFO os_vif [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:20:ee,bridge_name='br-int',has_traffic_filtering=True,id=070356c2-cd5d-4830-ae1b-b6767843f668,network=Network(0595ca0a-0994-45d7-b765-c6d94beda8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070356c2-cd')#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.974 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.974 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.975 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] No VIF found with MAC fa:16:3e:58:20:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 25 14:24:57 np0005535656 nova_compute[187219]: 2025-11-25 19:24:57.975 187223 INFO nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Using config drive#033[00m
Nov 25 14:24:57 np0005535656 podman[220046]: 2025-11-25 19:24:57.979341449 +0000 UTC m=+0.093246922 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 25 14:24:58 np0005535656 nova_compute[187219]: 2025-11-25 19:24:58.340 187223 INFO nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Creating config drive at /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk.config#033[00m
Nov 25 14:24:58 np0005535656 nova_compute[187219]: 2025-11-25 19:24:58.349 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa3fu8lax execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:24:58 np0005535656 nova_compute[187219]: 2025-11-25 19:24:58.491 187223 DEBUG oslo_concurrency.processutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa3fu8lax" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:24:58 np0005535656 kernel: tap070356c2-cd: entered promiscuous mode
Nov 25 14:24:58 np0005535656 NetworkManager[55548]: <info>  [1764098698.5907] manager: (tap070356c2-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 25 14:24:58 np0005535656 ovn_controller[95460]: 2025-11-25T19:24:58Z|00205|binding|INFO|Claiming lport 070356c2-cd5d-4830-ae1b-b6767843f668 for this chassis.
Nov 25 14:24:58 np0005535656 ovn_controller[95460]: 2025-11-25T19:24:58Z|00206|binding|INFO|070356c2-cd5d-4830-ae1b-b6767843f668: Claiming fa:16:3e:58:20:ee 10.100.0.4
Nov 25 14:24:58 np0005535656 nova_compute[187219]: 2025-11-25 19:24:58.590 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:58 np0005535656 nova_compute[187219]: 2025-11-25 19:24:58.596 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:58 np0005535656 nova_compute[187219]: 2025-11-25 19:24:58.606 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.620 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:20:ee 10.100.0.4'], port_security=['fa:16:3e:58:20:ee 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '71c4cd6e-6474-4a98-91a6-f61169eb7b8f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0595ca0a-0994-45d7-b765-c6d94beda8f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3476b4f1173485eb848b834c8dd8cf9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b6935ea1-8197-489e-be98-f0ddd731f478', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=304e8e7f-4bda-4ddc-87e4-15ec0aabe909, chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=070356c2-cd5d-4830-ae1b-b6767843f668) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.622 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 070356c2-cd5d-4830-ae1b-b6767843f668 in datapath 0595ca0a-0994-45d7-b765-c6d94beda8f0 bound to our chassis#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.625 104346 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0595ca0a-0994-45d7-b765-c6d94beda8f0#033[00m
Nov 25 14:24:58 np0005535656 systemd-udevd[220085]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.643 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6450bcea-ce5d-446e-bd19-4e92e2284392]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.645 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0595ca0a-01 in ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.648 208749 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0595ca0a-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.648 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[e2df74b4-3c2f-429b-9652-c43c3ab37fc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.649 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef7d02e-c73d-41da-be2c-6aeed8c04b38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 systemd-machined[153481]: New machine qemu-19-instance-0000001d.
Nov 25 14:24:58 np0005535656 NetworkManager[55548]: <info>  [1764098698.6596] device (tap070356c2-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 14:24:58 np0005535656 NetworkManager[55548]: <info>  [1764098698.6605] device (tap070356c2-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.667 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[1ff29e24-525b-4738-ac8b-9d44853cad36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 systemd[1]: Started Virtual Machine qemu-19-instance-0000001d.
Nov 25 14:24:58 np0005535656 nova_compute[187219]: 2025-11-25 19:24:58.694 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:58 np0005535656 ovn_controller[95460]: 2025-11-25T19:24:58Z|00207|binding|INFO|Setting lport 070356c2-cd5d-4830-ae1b-b6767843f668 ovn-installed in OVS
Nov 25 14:24:58 np0005535656 ovn_controller[95460]: 2025-11-25T19:24:58Z|00208|binding|INFO|Setting lport 070356c2-cd5d-4830-ae1b-b6767843f668 up in Southbound
Nov 25 14:24:58 np0005535656 nova_compute[187219]: 2025-11-25 19:24:58.699 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.702 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[50268a91-9796-4ed2-8f19-d76ffbe472aa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.754 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[d89e84ef-3043-4b46-bb05-2c6bfb543e58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.764 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca4c0d4-87f6-45b6-a5b6-0bb947319988]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 NetworkManager[55548]: <info>  [1764098698.7675] manager: (tap0595ca0a-00): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.819 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8dc6e2-3cc6-405e-a7bf-5d8dd8078755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.826 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5ab705-c987-4877-83d1-91afb6dc3894]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 NetworkManager[55548]: <info>  [1764098698.8582] device (tap0595ca0a-00): carrier: link connected
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.866 208774 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae90119-38cd-480c-9b52-c35619a3e49d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.886 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7a2dec-fe3c-4b07-a5ba-0943714299a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0595ca0a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566038, 'reachable_time': 25726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220118, 'error': None, 'target': 'ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.911 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[9e30c054-9078-4e4c-a20b-06f02d0731bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:5596'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 566038, 'tstamp': 566038}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220119, 'error': None, 'target': 'ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.930 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[f6bd7b0f-a4d2-4a21-864f-5c2356a2503c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0595ca0a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:55:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566038, 'reachable_time': 25726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220120, 'error': None, 'target': 'ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
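The device and namespace names in this stretch of the log follow a consistent truncation scheme: the metadata namespace is `ovnmeta-` plus the full network UUID, the VETH pair is `tap` plus a truncated network UUID with a `0`/`1` suffix (interface names are limited to 15 characters), and the instance's tap is `tap` plus a truncated port UUID. A sketch of that derivation, with the truncation lengths inferred from the names in this log rather than from the Neutron source:

```python
# Sketch: derive the names seen in this log (ovnmeta-..., tap0595ca0a-00/-01,
# tap070356c2-cd) from the network and port UUIDs. Truncation lengths are
# inferred from the log output; Neutron's real helpers live in
# neutron.agent.ovn.metadata.agent and may differ in detail.
NETWORK_ID = "0595ca0a-0994-45d7-b765-c6d94beda8f0"
PORT_ID = "070356c2-cd5d-4830-ae1b-b6767843f668"

def metadata_namespace(network_id: str) -> str:
    return f"ovnmeta-{network_id}"

def veth_pair(network_id: str) -> tuple:
    # "tap" + 10 UUID chars + one digit keeps names within IFNAMSIZ - 1 (15).
    prefix = f"tap{network_id[:10]}"
    return f"{prefix}0", f"{prefix}1"

def instance_tap(port_id: str) -> str:
    # "tap" + 11 UUID chars, as in tap070356c2-cd above.
    return f"tap{port_id[:11]}"

assert metadata_namespace(NETWORK_ID) == \
    "ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0"
assert veth_pair(NETWORK_ID) == ("tap0595ca0a-00", "tap0595ca0a-01")
assert instance_tap(PORT_ID) == "tap070356c2-cd"
```

The `-00` end of the pair is what NetworkManager and OVS see in the root namespace (plugged into br-int), while `-01` is the end moved into the `ovnmeta-` namespace.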
Nov 25 14:24:58 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:58.974 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[1947f383-ade0-4a39-af94-6cce54d39c08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.050 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[adf6cb71-d998-4991-b5cf-d7910be5236c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.053 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0595ca0a-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.053 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.054 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0595ca0a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:24:59 np0005535656 NetworkManager[55548]: <info>  [1764098699.0899] manager: (tap0595ca0a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Nov 25 14:24:59 np0005535656 kernel: tap0595ca0a-00: entered promiscuous mode
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.089 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.094 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.095 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0595ca0a-00, col_values=(('external_ids', {'iface-id': '588a5b53-b813-4ff8-9d3c-40ef74d1eb9a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:24:59 np0005535656 ovn_controller[95460]: 2025-11-25T19:24:59Z|00209|binding|INFO|Releasing lport 588a5b53-b813-4ff8-9d3c-40ef74d1eb9a from this chassis (sb_readonly=0)
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.097 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.103 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.104 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.105 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.120 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.125 104346 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0595ca0a-0994-45d7-b765-c6d94beda8f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0595ca0a-0994-45d7-b765-c6d94beda8f0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.127 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[73b3f974-fd5a-49e7-91be-209d653e7f64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.128 104346 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: global
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    log         /dev/log local0 debug
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    log-tag     haproxy-metadata-proxy-0595ca0a-0994-45d7-b765-c6d94beda8f0
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    user        root
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    group       root
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    maxconn     1024
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    pidfile     /var/lib/neutron/external/pids/0595ca0a-0994-45d7-b765-c6d94beda8f0.pid.haproxy
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    daemon
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: defaults
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    log global
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    mode http
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    option httplog
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    option dontlognull
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    option http-server-close
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    option forwardfor
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    retries                 3
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    timeout http-request    30s
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    timeout connect         30s
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    timeout client          32s
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    timeout server          32s
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    timeout http-keep-alive 30s
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: listen listener
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    bind 169.254.169.254:80
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    server metadata /var/lib/neutron/metadata_proxy
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]:    http-request add-header X-OVN-Network-ID 0595ca0a-0994-45d7-b765-c6d94beda8f0
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
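The haproxy configuration dumped above is rendered per network: it binds 169.254.169.254:80 inside the `ovnmeta-` namespace, tags each forwarded request with `X-OVN-Network-ID`, and proxies to the agent's unix socket at `/var/lib/neutron/metadata_proxy`. The network-specific pieces are all derived from the network UUID. A small sketch of that rendering (the real template lives in `neutron.agent.ovn.metadata.driver`; this stand-in reproduces only the values visible in the log):

```python
# Sketch: reproduce the network-specific values in the haproxy config above.
# Paths and the header name are taken verbatim from the log; the template
# itself is neutron.agent.ovn.metadata.driver's, not this one.
PID_DIR = "/var/lib/neutron/external/pids"

def haproxy_fragment(network_id: str) -> str:
    return "\n".join([
        f"log-tag     haproxy-metadata-proxy-{network_id}",
        f"pidfile     {PID_DIR}/{network_id}.pid.haproxy",
        "bind 169.254.169.254:80",
        "server metadata /var/lib/neutron/metadata_proxy",
        f"http-request add-header X-OVN-Network-ID {network_id}",
    ])

frag = haproxy_fragment("0595ca0a-0994-45d7-b765-c6d94beda8f0")
assert "haproxy-metadata-proxy-0595ca0a-0994-45d7-b765-c6d94beda8f0" in frag
assert frag.count("0595ca0a-0994-45d7-b765-c6d94beda8f0") == 3
```

The `X-OVN-Network-ID` header is how the shared metadata agent, listening on the unix socket, knows which network's namespace a given 169.254.169.254 request arrived from.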
Nov 25 14:24:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:24:59.130 104346 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0', 'env', 'PROCESS_TAG=haproxy-0595ca0a-0994-45d7-b765-c6d94beda8f0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0595ca0a-0994-45d7-b765-c6d94beda8f0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.154 187223 DEBUG nova.compute.manager [req-f6acaa94-1cdf-46e8-aca1-975ff9082cce req-33d85eca-c96e-4f5a-b177-b34d9f0cf255 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.155 187223 DEBUG oslo_concurrency.lockutils [req-f6acaa94-1cdf-46e8-aca1-975ff9082cce req-33d85eca-c96e-4f5a-b177-b34d9f0cf255 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.155 187223 DEBUG oslo_concurrency.lockutils [req-f6acaa94-1cdf-46e8-aca1-975ff9082cce req-33d85eca-c96e-4f5a-b177-b34d9f0cf255 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.156 187223 DEBUG oslo_concurrency.lockutils [req-f6acaa94-1cdf-46e8-aca1-975ff9082cce req-33d85eca-c96e-4f5a-b177-b34d9f0cf255 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.156 187223 DEBUG nova.compute.manager [req-f6acaa94-1cdf-46e8-aca1-975ff9082cce req-33d85eca-c96e-4f5a-b177-b34d9f0cf255 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Processing event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.393 187223 DEBUG nova.network.neutron [req-ab965eb0-da4b-4433-bbab-774ffc67cdd1 req-535286c7-fea1-4382-a4ff-be1f36bd8c1c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Updated VIF entry in instance network info cache for port 070356c2-cd5d-4830-ae1b-b6767843f668. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.394 187223 DEBUG nova.network.neutron [req-ab965eb0-da4b-4433-bbab-774ffc67cdd1 req-535286c7-fea1-4382-a4ff-be1f36bd8c1c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Updating instance_info_cache with network_info: [{"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
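The `instance_info_cache` blob logged above is plain JSON, so the interesting fields (port, MAC, fixed IPs, MTU) can be pulled out directly. A sketch against a trimmed copy of that exact entry (structure and values copied from the log; the helper name is hypothetical):

```python
import json

# Trimmed copy of the instance_info_cache entry logged above.
vif_json = '''[{"id": "070356c2-cd5d-4830-ae1b-b6767843f668",
 "address": "fa:16:3e:58:20:ee",
 "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0",
  "bridge": "br-int",
  "subnets": [{"cidr": "10.100.0.0/28",
   "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4}]}],
  "meta": {"mtu": 1442, "tunneled": true}},
 "active": false, "vnic_type": "normal"}]'''

def summarize_vifs(raw: str):
    """Reduce a Nova network_info blob to (port_id, mac, fixed_ips, mtu)."""
    out = []
    for vif in json.loads(raw):
        ips = [ip["address"]
               for subnet in vif["network"]["subnets"]
               for ip in subnet["ips"]]
        out.append((vif["id"], vif["address"], ips,
                    vif["network"]["meta"]["mtu"]))
    return out

assert summarize_vifs(vif_json) == [
    ("070356c2-cd5d-4830-ae1b-b6767843f668",
     "fa:16:3e:58:20:ee", ["10.100.0.4"], 1442)]
```

Note `"active": false` at this point: the VIF is plugged but OVN has only just set the port up, and Nova flips it active after processing the `network-vif-plugged` event seen in the surrounding lines.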
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.407 187223 DEBUG oslo_concurrency.lockutils [req-ab965eb0-da4b-4433-bbab-774ffc67cdd1 req-535286c7-fea1-4382-a4ff-be1f36bd8c1c 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.492 187223 DEBUG nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.494 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098699.493087, 71c4cd6e-6474-4a98-91a6-f61169eb7b8f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.494 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] VM Started (Lifecycle Event)#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.498 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.504 187223 INFO nova.virt.libvirt.driver [-] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Instance spawned successfully.#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.504 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.530 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.536 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.539 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.540 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.540 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.540 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.541 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.541 187223 DEBUG nova.virt.libvirt.driver [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.575 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.575 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098699.493661, 71c4cd6e-6474-4a98-91a6-f61169eb7b8f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.576 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.613 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.618 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098699.4979541, 71c4cd6e-6474-4a98-91a6-f61169eb7b8f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.619 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] VM Resumed (Lifecycle Event)#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.624 187223 INFO nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Took 6.80 seconds to spawn the instance on the hypervisor.#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.625 187223 DEBUG nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:24:59 np0005535656 podman[220159]: 2025-11-25 19:24:59.643918737 +0000 UTC m=+0.084779763 container create de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.680 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.686 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:24:59 np0005535656 systemd[1]: Started libpod-conmon-de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059.scope.
Nov 25 14:24:59 np0005535656 podman[220159]: 2025-11-25 19:24:59.604840615 +0000 UTC m=+0.045701741 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 25 14:24:59 np0005535656 systemd[1]: Started libcrun container.
Nov 25 14:24:59 np0005535656 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fb8b86ce8639468a0d06aa098fa6e5971e6e0e42b4c0df9969fc79b0c76db88/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.734 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 25 14:24:59 np0005535656 podman[220159]: 2025-11-25 19:24:59.73795878 +0000 UTC m=+0.178819836 container init de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 25 14:24:59 np0005535656 podman[220159]: 2025-11-25 19:24:59.74432155 +0000 UTC m=+0.185182606 container start de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.749 187223 INFO nova.compute.manager [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Took 7.44 seconds to build instance.#033[00m
Nov 25 14:24:59 np0005535656 nova_compute[187219]: 2025-11-25 19:24:59.776 187223 DEBUG oslo_concurrency.lockutils [None req-422189fa-7847-4f1b-bf0b-f362ab5f3a37 a222e3c58fcf4706ab56cbca2847c233 d3476b4f1173485eb848b834c8dd8cf9 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:24:59 np0005535656 neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0[220174]: [NOTICE]   (220178) : New worker (220180) forked
Nov 25 14:24:59 np0005535656 neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0[220174]: [NOTICE]   (220178) : Loading success.
Nov 25 14:25:00 np0005535656 nova_compute[187219]: 2025-11-25 19:25:00.242 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:01 np0005535656 nova_compute[187219]: 2025-11-25 19:25:01.265 187223 DEBUG nova.compute.manager [req-7b1e54e5-31ff-454f-8c1d-ae77c8385a89 req-44e398ea-12c5-42ad-98d1-712412b05cf0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:25:01 np0005535656 nova_compute[187219]: 2025-11-25 19:25:01.266 187223 DEBUG oslo_concurrency.lockutils [req-7b1e54e5-31ff-454f-8c1d-ae77c8385a89 req-44e398ea-12c5-42ad-98d1-712412b05cf0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:01 np0005535656 nova_compute[187219]: 2025-11-25 19:25:01.267 187223 DEBUG oslo_concurrency.lockutils [req-7b1e54e5-31ff-454f-8c1d-ae77c8385a89 req-44e398ea-12c5-42ad-98d1-712412b05cf0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:01 np0005535656 nova_compute[187219]: 2025-11-25 19:25:01.267 187223 DEBUG oslo_concurrency.lockutils [req-7b1e54e5-31ff-454f-8c1d-ae77c8385a89 req-44e398ea-12c5-42ad-98d1-712412b05cf0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:01 np0005535656 nova_compute[187219]: 2025-11-25 19:25:01.268 187223 DEBUG nova.compute.manager [req-7b1e54e5-31ff-454f-8c1d-ae77c8385a89 req-44e398ea-12c5-42ad-98d1-712412b05cf0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] No waiting events found dispatching network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:25:01 np0005535656 nova_compute[187219]: 2025-11-25 19:25:01.268 187223 WARNING nova.compute.manager [req-7b1e54e5-31ff-454f-8c1d-ae77c8385a89 req-44e398ea-12c5-42ad-98d1-712412b05cf0 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received unexpected event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 for instance with vm_state active and task_state None.#033[00m
Nov 25 14:25:02 np0005535656 nova_compute[187219]: 2025-11-25 19:25:02.898 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:05 np0005535656 nova_compute[187219]: 2025-11-25 19:25:05.274 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:05 np0005535656 podman[197580]: time="2025-11-25T19:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:25:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 25 14:25:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Nov 25 14:25:07 np0005535656 nova_compute[187219]: 2025-11-25 19:25:07.902 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:07 np0005535656 podman[220190]: 2025-11-25 19:25:07.976989863 +0000 UTC m=+0.088480033 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 14:25:10 np0005535656 nova_compute[187219]: 2025-11-25 19:25:10.315 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:11 np0005535656 ovn_controller[95460]: 2025-11-25T19:25:11Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:20:ee 10.100.0.4
Nov 25 14:25:11 np0005535656 ovn_controller[95460]: 2025-11-25T19:25:11Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:20:ee 10.100.0.4
Nov 25 14:25:12 np0005535656 nova_compute[187219]: 2025-11-25 19:25:12.288 187223 DEBUG nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Check if temp file /var/lib/nova/instances/tmpua1zvo4_ exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 25 14:25:12 np0005535656 nova_compute[187219]: 2025-11-25 19:25:12.289 187223 DEBUG nova.compute.manager [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpua1zvo4_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='71c4cd6e-6474-4a98-91a6-f61169eb7b8f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 25 14:25:12 np0005535656 nova_compute[187219]: 2025-11-25 19:25:12.907 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:13 np0005535656 nova_compute[187219]: 2025-11-25 19:25:13.299 187223 DEBUG oslo_concurrency.processutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:25:13 np0005535656 nova_compute[187219]: 2025-11-25 19:25:13.392 187223 DEBUG oslo_concurrency.processutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:25:13 np0005535656 nova_compute[187219]: 2025-11-25 19:25:13.393 187223 DEBUG oslo_concurrency.processutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 25 14:25:13 np0005535656 nova_compute[187219]: 2025-11-25 19:25:13.459 187223 DEBUG oslo_concurrency.processutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 25 14:25:13 np0005535656 nova_compute[187219]: 2025-11-25 19:25:13.710 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:15 np0005535656 nova_compute[187219]: 2025-11-25 19:25:15.319 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:16 np0005535656 systemd-logind[788]: New session 46 of user nova.
Nov 25 14:25:16 np0005535656 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 14:25:16 np0005535656 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 14:25:16 np0005535656 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 14:25:16 np0005535656 systemd[1]: Starting User Manager for UID 42436...
Nov 25 14:25:16 np0005535656 nova_compute[187219]: 2025-11-25 19:25:16.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:16 np0005535656 nova_compute[187219]: 2025-11-25 19:25:16.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:25:16 np0005535656 nova_compute[187219]: 2025-11-25 19:25:16.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:25:16 np0005535656 systemd[220244]: Queued start job for default target Main User Target.
Nov 25 14:25:16 np0005535656 nova_compute[187219]: 2025-11-25 19:25:16.708 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:25:16 np0005535656 nova_compute[187219]: 2025-11-25 19:25:16.708 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquired lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:25:16 np0005535656 nova_compute[187219]: 2025-11-25 19:25:16.709 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 25 14:25:16 np0005535656 nova_compute[187219]: 2025-11-25 19:25:16.709 187223 DEBUG nova.objects.instance [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 71c4cd6e-6474-4a98-91a6-f61169eb7b8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:25:16 np0005535656 systemd[220244]: Created slice User Application Slice.
Nov 25 14:25:16 np0005535656 systemd[220244]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:25:16 np0005535656 systemd[220244]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 14:25:16 np0005535656 systemd[220244]: Reached target Paths.
Nov 25 14:25:16 np0005535656 systemd[220244]: Reached target Timers.
Nov 25 14:25:16 np0005535656 systemd[220244]: Starting D-Bus User Message Bus Socket...
Nov 25 14:25:16 np0005535656 systemd[220244]: Starting Create User's Volatile Files and Directories...
Nov 25 14:25:16 np0005535656 systemd[220244]: Finished Create User's Volatile Files and Directories.
Nov 25 14:25:16 np0005535656 systemd[220244]: Listening on D-Bus User Message Bus Socket.
Nov 25 14:25:16 np0005535656 systemd[220244]: Reached target Sockets.
Nov 25 14:25:16 np0005535656 systemd[220244]: Reached target Basic System.
Nov 25 14:25:16 np0005535656 systemd[220244]: Reached target Main User Target.
Nov 25 14:25:16 np0005535656 systemd[220244]: Startup finished in 169ms.
Nov 25 14:25:16 np0005535656 systemd[1]: Started User Manager for UID 42436.
Nov 25 14:25:16 np0005535656 systemd[1]: Started Session 46 of User nova.
Nov 25 14:25:16 np0005535656 systemd[1]: session-46.scope: Deactivated successfully.
Nov 25 14:25:16 np0005535656 systemd-logind[788]: Session 46 logged out. Waiting for processes to exit.
Nov 25 14:25:16 np0005535656 systemd-logind[788]: Removed session 46.
Nov 25 14:25:17 np0005535656 nova_compute[187219]: 2025-11-25 19:25:17.911 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:17 np0005535656 podman[220262]: 2025-11-25 19:25:17.99704186 +0000 UTC m=+0.110187637 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.009 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:18 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:18.008 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:25:18 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:18.011 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.027 187223 DEBUG nova.compute.manager [req-6390ce2f-f909-4266-bd8e-5a3a4dfc7574 req-0fb37edb-0aed-4cfb-931d-41ec51aa49bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-unplugged-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.028 187223 DEBUG oslo_concurrency.lockutils [req-6390ce2f-f909-4266-bd8e-5a3a4dfc7574 req-0fb37edb-0aed-4cfb-931d-41ec51aa49bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.028 187223 DEBUG oslo_concurrency.lockutils [req-6390ce2f-f909-4266-bd8e-5a3a4dfc7574 req-0fb37edb-0aed-4cfb-931d-41ec51aa49bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.028 187223 DEBUG oslo_concurrency.lockutils [req-6390ce2f-f909-4266-bd8e-5a3a4dfc7574 req-0fb37edb-0aed-4cfb-931d-41ec51aa49bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.028 187223 DEBUG nova.compute.manager [req-6390ce2f-f909-4266-bd8e-5a3a4dfc7574 req-0fb37edb-0aed-4cfb-931d-41ec51aa49bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] No waiting events found dispatching network-vif-unplugged-070356c2-cd5d-4830-ae1b-b6767843f668 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.028 187223 DEBUG nova.compute.manager [req-6390ce2f-f909-4266-bd8e-5a3a4dfc7574 req-0fb37edb-0aed-4cfb-931d-41ec51aa49bf 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-unplugged-070356c2-cd5d-4830-ae1b-b6767843f668 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:25:18 np0005535656 podman[220261]: 2025-11-25 19:25:18.037830119 +0000 UTC m=+0.145121479 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.632 187223 INFO nova.compute.manager [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Took 5.17 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.633 187223 DEBUG nova.compute.manager [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.649 187223 DEBUG nova.compute.manager [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpua1zvo4_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='71c4cd6e-6474-4a98-91a6-f61169eb7b8f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(27db95be-1bf5-40c0-b1c8-9b141c122d58),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.674 187223 DEBUG nova.objects.instance [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lazy-loading 'migration_context' on Instance uuid 71c4cd6e-6474-4a98-91a6-f61169eb7b8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.676 187223 DEBUG nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.678 187223 DEBUG nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.678 187223 DEBUG nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.698 187223 DEBUG nova.virt.libvirt.vif [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:24:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-848740995',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-848740995',id=29,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:24:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d3476b4f1173485eb848b834c8dd8cf9',ramdisk_id='',reservation_id='r-u668r4aw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1998892509',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1998892509-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:24:59Z,user_data=None,user_id='a222e3c58fcf4706ab56cbca2847c233',uuid=71c4cd6e-6474-4a98-91a6-f61169eb7b8f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.699 187223 DEBUG nova.network.os_vif_util [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.700 187223 DEBUG nova.network.os_vif_util [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:20:ee,bridge_name='br-int',has_traffic_filtering=True,id=070356c2-cd5d-4830-ae1b-b6767843f668,network=Network(0595ca0a-0994-45d7-b765-c6d94beda8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070356c2-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.700 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 14:25:18 np0005535656 nova_compute[187219]:  <mac address="fa:16:3e:58:20:ee"/>
Nov 25 14:25:18 np0005535656 nova_compute[187219]:  <model type="virtio"/>
Nov 25 14:25:18 np0005535656 nova_compute[187219]:  <driver name="vhost" rx_queue_size="512"/>
Nov 25 14:25:18 np0005535656 nova_compute[187219]:  <mtu size="1442"/>
Nov 25 14:25:18 np0005535656 nova_compute[187219]:  <target dev="tap070356c2-cd"/>
Nov 25 14:25:18 np0005535656 nova_compute[187219]: </interface>
Nov 25 14:25:18 np0005535656 nova_compute[187219]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.701 187223 DEBUG nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.843 187223 DEBUG nova.network.neutron [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Updating instance_info_cache with network_info: [{"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.868 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Releasing lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.868 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.868 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:18 np0005535656 nova_compute[187219]: 2025-11-25 19:25:18.869 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:19 np0005535656 nova_compute[187219]: 2025-11-25 19:25:19.181 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:25:19 np0005535656 nova_compute[187219]: 2025-11-25 19:25:19.181 187223 INFO nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 25 14:25:19 np0005535656 nova_compute[187219]: 2025-11-25 19:25:19.236 187223 INFO nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 25 14:25:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:25:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:25:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:25:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:25:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:25:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:25:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:25:19 np0005535656 nova_compute[187219]: 2025-11-25 19:25:19.769 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:25:19 np0005535656 nova_compute[187219]: 2025-11-25 19:25:19.770 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.141 187223 DEBUG nova.compute.manager [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.142 187223 DEBUG oslo_concurrency.lockutils [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.142 187223 DEBUG oslo_concurrency.lockutils [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.142 187223 DEBUG oslo_concurrency.lockutils [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.143 187223 DEBUG nova.compute.manager [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] No waiting events found dispatching network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.143 187223 WARNING nova.compute.manager [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received unexpected event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.143 187223 DEBUG nova.compute.manager [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-changed-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.144 187223 DEBUG nova.compute.manager [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Refreshing instance network info cache due to event network-changed-070356c2-cd5d-4830-ae1b-b6767843f668. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.144 187223 DEBUG oslo_concurrency.lockutils [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.144 187223 DEBUG oslo_concurrency.lockutils [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquired lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.145 187223 DEBUG nova.network.neutron [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Refreshing network info cache for port 070356c2-cd5d-4830-ae1b-b6767843f668 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.274 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.275 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.321 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.779 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:25:20 np0005535656 nova_compute[187219]: 2025-11-25 19:25:20.779 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:25:21 np0005535656 nova_compute[187219]: 2025-11-25 19:25:21.284 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:25:21 np0005535656 nova_compute[187219]: 2025-11-25 19:25:21.285 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:25:21 np0005535656 nova_compute[187219]: 2025-11-25 19:25:21.491 187223 DEBUG nova.network.neutron [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Updated VIF entry in instance network info cache for port 070356c2-cd5d-4830-ae1b-b6767843f668. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 25 14:25:21 np0005535656 nova_compute[187219]: 2025-11-25 19:25:21.492 187223 DEBUG nova.network.neutron [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Updating instance_info_cache with network_info: [{"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 25 14:25:21 np0005535656 nova_compute[187219]: 2025-11-25 19:25:21.520 187223 DEBUG oslo_concurrency.lockutils [req-b33684de-5eb1-4d91-9d1c-1ea7829af3cd req-f365bac2-3fa6-4c2b-9522-18ee95163a4d 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Releasing lock "refresh_cache-71c4cd6e-6474-4a98-91a6-f61169eb7b8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 25 14:25:21 np0005535656 nova_compute[187219]: 2025-11-25 19:25:21.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:21 np0005535656 nova_compute[187219]: 2025-11-25 19:25:21.688 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:21 np0005535656 nova_compute[187219]: 2025-11-25 19:25:21.789 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:25:21 np0005535656 nova_compute[187219]: 2025-11-25 19:25:21.790 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:25:22 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:22.014 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.294 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.294 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.688 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.798 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.799 187223 DEBUG nova.virt.libvirt.migration [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.889 187223 DEBUG nova.virt.driver [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] Emitting event <LifecycleEvent: 1764098722.8888285, 71c4cd6e-6474-4a98-91a6-f61169eb7b8f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.890 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] VM Paused (Lifecycle Event)#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.917 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.943 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.948 187223 DEBUG nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 25 14:25:22 np0005535656 nova_compute[187219]: 2025-11-25 19:25:22.974 187223 INFO nova.compute.manager [None req-624cffa4-e9df-422d-a901-543c535d43e2 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 25 14:25:23 np0005535656 kernel: tap070356c2-cd (unregistering): left promiscuous mode
Nov 25 14:25:23 np0005535656 NetworkManager[55548]: <info>  [1764098723.0527] device (tap070356c2-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.070 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:23 np0005535656 ovn_controller[95460]: 2025-11-25T19:25:23Z|00210|binding|INFO|Releasing lport 070356c2-cd5d-4830-ae1b-b6767843f668 from this chassis (sb_readonly=0)
Nov 25 14:25:23 np0005535656 ovn_controller[95460]: 2025-11-25T19:25:23Z|00211|binding|INFO|Setting lport 070356c2-cd5d-4830-ae1b-b6767843f668 down in Southbound
Nov 25 14:25:23 np0005535656 ovn_controller[95460]: 2025-11-25T19:25:23Z|00212|binding|INFO|Removing iface tap070356c2-cd ovn-installed in OVS
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.073 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.103 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:23 np0005535656 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Nov 25 14:25:23 np0005535656 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001d.scope: Consumed 14.086s CPU time.
Nov 25 14:25:23 np0005535656 systemd-machined[153481]: Machine qemu-19-instance-0000001d terminated.
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.122 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:20:ee 10.100.0.4'], port_security=['fa:16:3e:58:20:ee 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e972f2ff-26b4-4f8a-a1c4-86615f1f7462'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '71c4cd6e-6474-4a98-91a6-f61169eb7b8f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0595ca0a-0994-45d7-b765-c6d94beda8f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3476b4f1173485eb848b834c8dd8cf9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b6935ea1-8197-489e-be98-f0ddd731f478', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=304e8e7f-4bda-4ddc-87e4-15ec0aabe909, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>], logical_port=070356c2-cd5d-4830-ae1b-b6767843f668) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f89972f09d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.125 104346 INFO neutron.agent.ovn.metadata.agent [-] Port 070356c2-cd5d-4830-ae1b-b6767843f668 in datapath 0595ca0a-0994-45d7-b765-c6d94beda8f0 unbound from our chassis#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.127 104346 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0595ca0a-0994-45d7-b765-c6d94beda8f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.130 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[63ef5633-c9c9-411e-9e49-931dcf9a5f96]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.131 104346 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0 namespace which is not needed anymore#033[00m
Nov 25 14:25:23 np0005535656 podman[220307]: 2025-11-25 19:25:23.164226215 +0000 UTC m=+0.081713721 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41)
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.243 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.252 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.279 187223 DEBUG nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.280 187223 DEBUG nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.280 187223 DEBUG nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.301 187223 DEBUG nova.virt.libvirt.guest [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '71c4cd6e-6474-4a98-91a6-f61169eb7b8f' (instance-0000001d) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.301 187223 INFO nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Migration operation has completed#033[00m
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.301 187223 INFO nova.compute.manager [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] _post_live_migration() is started..#033[00m
Nov 25 14:25:23 np0005535656 neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0[220174]: [NOTICE]   (220178) : haproxy version is 2.8.14-c23fe91
Nov 25 14:25:23 np0005535656 neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0[220174]: [NOTICE]   (220178) : path to executable is /usr/sbin/haproxy
Nov 25 14:25:23 np0005535656 neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0[220174]: [WARNING]  (220178) : Exiting Master process...
Nov 25 14:25:23 np0005535656 neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0[220174]: [WARNING]  (220178) : Exiting Master process...
Nov 25 14:25:23 np0005535656 neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0[220174]: [ALERT]    (220178) : Current worker (220180) exited with code 143 (Terminated)
Nov 25 14:25:23 np0005535656 neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0[220174]: [WARNING]  (220178) : All workers exited. Exiting... (0)
Nov 25 14:25:23 np0005535656 systemd[1]: libpod-de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059.scope: Deactivated successfully.
Nov 25 14:25:23 np0005535656 podman[220353]: 2025-11-25 19:25:23.31336 +0000 UTC m=+0.058304620 container died de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:25:23 np0005535656 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059-userdata-shm.mount: Deactivated successfully.
Nov 25 14:25:23 np0005535656 systemd[1]: var-lib-containers-storage-overlay-6fb8b86ce8639468a0d06aa098fa6e5971e6e0e42b4c0df9969fc79b0c76db88-merged.mount: Deactivated successfully.
Nov 25 14:25:23 np0005535656 podman[220353]: 2025-11-25 19:25:23.355143635 +0000 UTC m=+0.100088285 container cleanup de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 25 14:25:23 np0005535656 systemd[1]: libpod-conmon-de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059.scope: Deactivated successfully.
Nov 25 14:25:23 np0005535656 podman[220394]: 2025-11-25 19:25:23.441345606 +0000 UTC m=+0.051949009 container remove de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.447 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[c10fe10e-7006-48c2-8327-1b43e249ca21]: (4, ('Tue Nov 25 07:25:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0 (de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059)\nde6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059\nTue Nov 25 07:25:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0 (de6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059)\nde6c6694aa2d1bdb005ceef203479eb3a77dc3a3b8f7311c1739cdfe999f3059\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.450 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[0cfc192a-622f-4e58-8917-78391db3fb2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.451 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0595ca0a-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.454 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:23 np0005535656 kernel: tap0595ca0a-00: left promiscuous mode
Nov 25 14:25:23 np0005535656 nova_compute[187219]: 2025-11-25 19:25:23.482 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.485 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[91b30493-0b94-474b-8d30-a7b1dd0f456d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.503 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[bd423523-e08c-4368-bb7d-db6fbf0c83be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.505 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[6ccf1af3-4466-4e66-89b8-a1b920ecc1be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.527 208749 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d42c9f-987d-4cf0-a448-caa0aa25bb46]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566026, 'reachable_time': 38176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220410, 'error': None, 'target': 'ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:25:23 np0005535656 systemd[1]: run-netns-ovnmeta\x2d0595ca0a\x2d0994\x2d45d7\x2db765\x2dc6d94beda8f0.mount: Deactivated successfully.
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.531 104456 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0595ca0a-0994-45d7-b765-c6d94beda8f0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 25 14:25:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:23.531 104456 DEBUG oslo.privsep.daemon [-] privsep: reply[03a43855-ee07-44c8-86f9-617f481225ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.699 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.699 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.700 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.700 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.855 187223 DEBUG nova.compute.manager [req-44237c51-f664-44a8-9de2-eac8d11e4e15 req-2831aee5-bef1-4e1f-aacb-3475265d859b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-unplugged-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.856 187223 DEBUG oslo_concurrency.lockutils [req-44237c51-f664-44a8-9de2-eac8d11e4e15 req-2831aee5-bef1-4e1f-aacb-3475265d859b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.856 187223 DEBUG oslo_concurrency.lockutils [req-44237c51-f664-44a8-9de2-eac8d11e4e15 req-2831aee5-bef1-4e1f-aacb-3475265d859b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.856 187223 DEBUG oslo_concurrency.lockutils [req-44237c51-f664-44a8-9de2-eac8d11e4e15 req-2831aee5-bef1-4e1f-aacb-3475265d859b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.856 187223 DEBUG nova.compute.manager [req-44237c51-f664-44a8-9de2-eac8d11e4e15 req-2831aee5-bef1-4e1f-aacb-3475265d859b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] No waiting events found dispatching network-vif-unplugged-070356c2-cd5d-4830-ae1b-b6767843f668 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.857 187223 DEBUG nova.compute.manager [req-44237c51-f664-44a8-9de2-eac8d11e4e15 req-2831aee5-bef1-4e1f-aacb-3475265d859b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-unplugged-070356c2-cd5d-4830-ae1b-b6767843f668 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.905 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.906 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5847MB free_disk=73.13079071044922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.906 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.906 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.955 187223 INFO nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Updating resource usage from migration 27db95be-1bf5-40c0-b1c8-9b141c122d58#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.968 187223 DEBUG nova.compute.manager [req-fff073ab-7f20-4573-ac1e-ed91624244e9 req-8e808e06-feb5-4380-89ab-8342c589e09b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-unplugged-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.969 187223 DEBUG oslo_concurrency.lockutils [req-fff073ab-7f20-4573-ac1e-ed91624244e9 req-8e808e06-feb5-4380-89ab-8342c589e09b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.969 187223 DEBUG oslo_concurrency.lockutils [req-fff073ab-7f20-4573-ac1e-ed91624244e9 req-8e808e06-feb5-4380-89ab-8342c589e09b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.969 187223 DEBUG oslo_concurrency.lockutils [req-fff073ab-7f20-4573-ac1e-ed91624244e9 req-8e808e06-feb5-4380-89ab-8342c589e09b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.970 187223 DEBUG nova.compute.manager [req-fff073ab-7f20-4573-ac1e-ed91624244e9 req-8e808e06-feb5-4380-89ab-8342c589e09b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] No waiting events found dispatching network-vif-unplugged-070356c2-cd5d-4830-ae1b-b6767843f668 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.970 187223 DEBUG nova.compute.manager [req-fff073ab-7f20-4573-ac1e-ed91624244e9 req-8e808e06-feb5-4380-89ab-8342c589e09b 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-unplugged-070356c2-cd5d-4830-ae1b-b6767843f668 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.998 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Migration 27db95be-1bf5-40c0-b1c8-9b141c122d58 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 14:25:24 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.999 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:24.999 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.050 187223 DEBUG nova.network.neutron [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Activated binding for port 070356c2-cd5d-4830-ae1b-b6767843f668 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.051 187223 DEBUG nova.compute.manager [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.052 187223 DEBUG nova.virt.libvirt.vif [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-25T19:24:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-848740995',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-848740995',id=29,image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:24:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d3476b4f1173485eb848b834c8dd8cf9',ramdisk_id='',reservation_id='r-u668r4aw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='1ea5e141-b92c-44f3-97b7-7b313587d3bf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1998892509',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1998892509-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:25:09Z,user_data=None,user_id='a222e3c58fcf4706ab56cbca2847c233',uuid=71c4cd6e-6474-4a98-91a6-f61169eb7b8f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.052 187223 DEBUG nova.network.os_vif_util [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converting VIF {"id": "070356c2-cd5d-4830-ae1b-b6767843f668", "address": "fa:16:3e:58:20:ee", "network": {"id": "0595ca0a-0994-45d7-b765-c6d94beda8f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1182114068-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d3476b4f1173485eb848b834c8dd8cf9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap070356c2-cd", "ovs_interfaceid": "070356c2-cd5d-4830-ae1b-b6767843f668", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.053 187223 DEBUG nova.network.os_vif_util [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:20:ee,bridge_name='br-int',has_traffic_filtering=True,id=070356c2-cd5d-4830-ae1b-b6767843f668,network=Network(0595ca0a-0994-45d7-b765-c6d94beda8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070356c2-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.054 187223 DEBUG os_vif [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:20:ee,bridge_name='br-int',has_traffic_filtering=True,id=070356c2-cd5d-4830-ae1b-b6767843f668,network=Network(0595ca0a-0994-45d7-b765-c6d94beda8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070356c2-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.056 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.057 187223 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap070356c2-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.058 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.061 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.064 187223 INFO os_vif [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:20:ee,bridge_name='br-int',has_traffic_filtering=True,id=070356c2-cd5d-4830-ae1b-b6767843f668,network=Network(0595ca0a-0994-45d7-b765-c6d94beda8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap070356c2-cd')#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.064 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.134 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.158 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.192 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.192 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.193 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.193 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.194 187223 DEBUG nova.compute.manager [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.194 187223 INFO nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Deleting instance files /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f_del#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.195 187223 INFO nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Deletion of /var/lib/nova/instances/71c4cd6e-6474-4a98-91a6-f61169eb7b8f_del complete#033[00m
Nov 25 14:25:25 np0005535656 nova_compute[187219]: 2025-11-25 19:25:25.325 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:26 np0005535656 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 14:25:26 np0005535656 systemd[220244]: Activating special unit Exit the Session...
Nov 25 14:25:26 np0005535656 systemd[220244]: Stopped target Main User Target.
Nov 25 14:25:26 np0005535656 systemd[220244]: Stopped target Basic System.
Nov 25 14:25:26 np0005535656 systemd[220244]: Stopped target Paths.
Nov 25 14:25:26 np0005535656 systemd[220244]: Stopped target Sockets.
Nov 25 14:25:26 np0005535656 systemd[220244]: Stopped target Timers.
Nov 25 14:25:26 np0005535656 systemd[220244]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 14:25:26 np0005535656 systemd[220244]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 14:25:26 np0005535656 systemd[220244]: Closed D-Bus User Message Bus Socket.
Nov 25 14:25:26 np0005535656 systemd[220244]: Stopped Create User's Volatile Files and Directories.
Nov 25 14:25:26 np0005535656 systemd[220244]: Removed slice User Application Slice.
Nov 25 14:25:26 np0005535656 systemd[220244]: Reached target Shutdown.
Nov 25 14:25:26 np0005535656 systemd[220244]: Finished Exit the Session.
Nov 25 14:25:26 np0005535656 systemd[220244]: Reached target Exit the Session.
Nov 25 14:25:26 np0005535656 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 14:25:26 np0005535656 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 14:25:26 np0005535656 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 14:25:26 np0005535656 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 14:25:26 np0005535656 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 14:25:26 np0005535656 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 14:25:26 np0005535656 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.942 187223 DEBUG nova.compute.manager [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.943 187223 DEBUG oslo_concurrency.lockutils [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.943 187223 DEBUG oslo_concurrency.lockutils [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.944 187223 DEBUG oslo_concurrency.lockutils [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.944 187223 DEBUG nova.compute.manager [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] No waiting events found dispatching network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.944 187223 WARNING nova.compute.manager [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received unexpected event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.945 187223 DEBUG nova.compute.manager [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.945 187223 DEBUG oslo_concurrency.lockutils [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.945 187223 DEBUG oslo_concurrency.lockutils [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.946 187223 DEBUG oslo_concurrency.lockutils [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.946 187223 DEBUG nova.compute.manager [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] No waiting events found dispatching network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.946 187223 WARNING nova.compute.manager [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received unexpected event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.946 187223 DEBUG nova.compute.manager [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.947 187223 DEBUG oslo_concurrency.lockutils [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.947 187223 DEBUG oslo_concurrency.lockutils [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.947 187223 DEBUG oslo_concurrency.lockutils [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.948 187223 DEBUG nova.compute.manager [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] No waiting events found dispatching network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 25 14:25:26 np0005535656 nova_compute[187219]: 2025-11-25 19:25:26.948 187223 WARNING nova.compute.manager [req-cdc1891f-1c8b-4ff1-be2d-2e1779e9f82f req-ee8ddcab-e0a1-43ec-9ec9-75f641cbbd19 434dedd106824ce7ade622ad00da89b5 e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Received unexpected event network-vif-plugged-070356c2-cd5d-4830-ae1b-b6767843f668 for instance with vm_state active and task_state migrating.#033[00m
Nov 25 14:25:28 np0005535656 podman[220414]: 2025-11-25 19:25:28.977092075 +0000 UTC m=+0.089215233 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 14:25:29 np0005535656 nova_compute[187219]: 2025-11-25 19:25:29.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:29 np0005535656 nova_compute[187219]: 2025-11-25 19:25:29.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 25 14:25:30 np0005535656 nova_compute[187219]: 2025-11-25 19:25:30.060 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:30 np0005535656 nova_compute[187219]: 2025-11-25 19:25:30.370 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.693 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.694 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.713 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.811 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.812 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.812 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "71c4cd6e-6474-4a98-91a6-f61169eb7b8f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.844 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.845 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.845 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:32 np0005535656 nova_compute[187219]: 2025-11-25 19:25:32.845 187223 DEBUG nova.compute.resource_tracker [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.088 187223 WARNING nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.092 187223 DEBUG nova.compute.resource_tracker [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5861MB free_disk=73.15942001342773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.092 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.093 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.263 187223 DEBUG nova.compute.resource_tracker [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration for instance 71c4cd6e-6474-4a98-91a6-f61169eb7b8f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.281 187223 DEBUG nova.compute.resource_tracker [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.306 187223 DEBUG nova.compute.resource_tracker [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Migration 27db95be-1bf5-40c0-b1c8-9b141c122d58 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.307 187223 DEBUG nova.compute.resource_tracker [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.307 187223 DEBUG nova.compute.resource_tracker [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.348 187223 DEBUG nova.compute.provider_tree [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.365 187223 DEBUG nova.scheduler.client.report [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.385 187223 DEBUG nova.compute.resource_tracker [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.386 187223 DEBUG oslo_concurrency.lockutils [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.393 187223 INFO nova.compute.manager [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.478 187223 INFO nova.scheduler.client.report [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] Deleted allocation for migration 27db95be-1bf5-40c0-b1c8-9b141c122d58#033[00m
Nov 25 14:25:33 np0005535656 nova_compute[187219]: 2025-11-25 19:25:33.478 187223 DEBUG nova.virt.libvirt.driver [None req-e138d080-4ec5-49f6-b3f1-db1e0e0eb303 fc07ab8e4ddd41119ec2d141851ebe8c e2023dc5cbc84c69bb179f84d3006c95 - - default default] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 25 14:25:35 np0005535656 nova_compute[187219]: 2025-11-25 19:25:35.062 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:35 np0005535656 nova_compute[187219]: 2025-11-25 19:25:35.387 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:35 np0005535656 podman[197580]: time="2025-11-25T19:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:25:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:25:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2608 "" "Go-http-client/1.1"
Nov 25 14:25:35 np0005535656 nova_compute[187219]: 2025-11-25 19:25:35.811 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:38 np0005535656 nova_compute[187219]: 2025-11-25 19:25:38.277 187223 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764098723.2765057, 71c4cd6e-6474-4a98-91a6-f61169eb7b8f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 25 14:25:38 np0005535656 nova_compute[187219]: 2025-11-25 19:25:38.277 187223 INFO nova.compute.manager [-] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] VM Stopped (Lifecycle Event)#033[00m
Nov 25 14:25:38 np0005535656 nova_compute[187219]: 2025-11-25 19:25:38.303 187223 DEBUG nova.compute.manager [None req-1cdb06af-e0d1-42a8-bfe7-97ec610d6828 - - - - - -] [instance: 71c4cd6e-6474-4a98-91a6-f61169eb7b8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 25 14:25:38 np0005535656 podman[220435]: 2025-11-25 19:25:38.994179544 +0000 UTC m=+0.111172925 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:25:39 np0005535656 nova_compute[187219]: 2025-11-25 19:25:39.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:25:40 np0005535656 nova_compute[187219]: 2025-11-25 19:25:40.092 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:40 np0005535656 nova_compute[187219]: 2025-11-25 19:25:40.389 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:45 np0005535656 nova_compute[187219]: 2025-11-25 19:25:45.095 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:45 np0005535656 nova_compute[187219]: 2025-11-25 19:25:45.391 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:48 np0005535656 podman[220460]: 2025-11-25 19:25:48.995769035 +0000 UTC m=+0.099997873 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:25:49 np0005535656 podman[220459]: 2025-11-25 19:25:49.098907842 +0000 UTC m=+0.212162364 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 25 14:25:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:25:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:25:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:25:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:25:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:25:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:25:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:25:50 np0005535656 nova_compute[187219]: 2025-11-25 19:25:50.144 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:50 np0005535656 nova_compute[187219]: 2025-11-25 19:25:50.393 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:53 np0005535656 podman[220507]: 2025-11-25 19:25:53.989114968 +0000 UTC m=+0.099042967 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 14:25:55 np0005535656 nova_compute[187219]: 2025-11-25 19:25:55.147 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:55 np0005535656 nova_compute[187219]: 2025-11-25 19:25:55.395 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:25:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:59.103 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:25:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:59.104 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:25:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:25:59.104 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:25:59 np0005535656 podman[220528]: 2025-11-25 19:25:59.987146874 +0000 UTC m=+0.097973169 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 25 14:26:00 np0005535656 nova_compute[187219]: 2025-11-25 19:26:00.150 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:00 np0005535656 nova_compute[187219]: 2025-11-25 19:26:00.449 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:05 np0005535656 nova_compute[187219]: 2025-11-25 19:26:05.153 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:05 np0005535656 nova_compute[187219]: 2025-11-25 19:26:05.451 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:05 np0005535656 podman[197580]: time="2025-11-25T19:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:26:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:26:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 25 14:26:07 np0005535656 ovn_controller[95460]: 2025-11-25T19:26:07Z|00213|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Nov 25 14:26:09 np0005535656 podman[220549]: 2025-11-25 19:26:09.981664474 +0000 UTC m=+0.094031874 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:26:10 np0005535656 nova_compute[187219]: 2025-11-25 19:26:10.190 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:10 np0005535656 nova_compute[187219]: 2025-11-25 19:26:10.452 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:15 np0005535656 nova_compute[187219]: 2025-11-25 19:26:15.117 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:15 np0005535656 nova_compute[187219]: 2025-11-25 19:26:15.192 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:15 np0005535656 nova_compute[187219]: 2025-11-25 19:26:15.455 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:15 np0005535656 nova_compute[187219]: 2025-11-25 19:26:15.686 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:26:16 np0005535656 nova_compute[187219]: 2025-11-25 19:26:16.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:26:17 np0005535656 nova_compute[187219]: 2025-11-25 19:26:17.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:26:18 np0005535656 nova_compute[187219]: 2025-11-25 19:26:18.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:26:18 np0005535656 nova_compute[187219]: 2025-11-25 19:26:18.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:26:18 np0005535656 nova_compute[187219]: 2025-11-25 19:26:18.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:26:18 np0005535656 nova_compute[187219]: 2025-11-25 19:26:18.864 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:26:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:26:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:26:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:26:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:26:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:26:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:26:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:26:19 np0005535656 podman[220574]: 2025-11-25 19:26:19.983978063 +0000 UTC m=+0.091430312 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 14:26:20 np0005535656 podman[220573]: 2025-11-25 19:26:20.023525499 +0000 UTC m=+0.138339926 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 14:26:20 np0005535656 nova_compute[187219]: 2025-11-25 19:26:20.194 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:20 np0005535656 nova_compute[187219]: 2025-11-25 19:26:20.457 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:20 np0005535656 nova_compute[187219]: 2025-11-25 19:26:20.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:26:20 np0005535656 nova_compute[187219]: 2025-11-25 19:26:20.672 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:26:22 np0005535656 nova_compute[187219]: 2025-11-25 19:26:22.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:26:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:26:23.987 104346 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6a:75:de', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:f4:05:d1:77:b1'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 25 14:26:23 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:26:23.988 104346 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 25 14:26:23 np0005535656 nova_compute[187219]: 2025-11-25 19:26:23.989 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.667 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.671 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.699 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.699 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.700 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.700 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.944 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.946 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5873MB free_disk=73.1594009399414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.947 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:26:24 np0005535656 nova_compute[187219]: 2025-11-25 19:26:24.947 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:26:24 np0005535656 podman[220619]: 2025-11-25 19:26:24.956077465 +0000 UTC m=+0.071629609 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Nov 25 14:26:25 np0005535656 nova_compute[187219]: 2025-11-25 19:26:25.056 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:26:25 np0005535656 nova_compute[187219]: 2025-11-25 19:26:25.056 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:26:25 np0005535656 nova_compute[187219]: 2025-11-25 19:26:25.090 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:26:25 np0005535656 nova_compute[187219]: 2025-11-25 19:26:25.107 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:26:25 np0005535656 nova_compute[187219]: 2025-11-25 19:26:25.109 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:26:25 np0005535656 nova_compute[187219]: 2025-11-25 19:26:25.110 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:26:25 np0005535656 nova_compute[187219]: 2025-11-25 19:26:25.196 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:25 np0005535656 nova_compute[187219]: 2025-11-25 19:26:25.459 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:27 np0005535656 nova_compute[187219]: 2025-11-25 19:26:27.110 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:26:28 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:26:28.989 104346 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=0dba517c-b8b5-44c5-b9d2-340b509da9f7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 25 14:26:30 np0005535656 nova_compute[187219]: 2025-11-25 19:26:30.198 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:30 np0005535656 nova_compute[187219]: 2025-11-25 19:26:30.492 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:30 np0005535656 podman[220640]: 2025-11-25 19:26:30.944599745 +0000 UTC m=+0.064493128 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 14:26:35 np0005535656 nova_compute[187219]: 2025-11-25 19:26:35.202 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:35 np0005535656 nova_compute[187219]: 2025-11-25 19:26:35.536 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:35 np0005535656 podman[197580]: time="2025-11-25T19:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:26:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:26:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Nov 25 14:26:40 np0005535656 nova_compute[187219]: 2025-11-25 19:26:40.205 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:40 np0005535656 nova_compute[187219]: 2025-11-25 19:26:40.538 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:40 np0005535656 podman[220660]: 2025-11-25 19:26:40.965044743 +0000 UTC m=+0.077769405 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 14:26:45 np0005535656 nova_compute[187219]: 2025-11-25 19:26:45.208 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:45 np0005535656 nova_compute[187219]: 2025-11-25 19:26:45.540 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:26:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:26:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:26:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:26:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:26:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:26:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:26:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:26:50 np0005535656 nova_compute[187219]: 2025-11-25 19:26:50.210 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:50 np0005535656 nova_compute[187219]: 2025-11-25 19:26:50.543 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:50 np0005535656 podman[220685]: 2025-11-25 19:26:50.984476904 +0000 UTC m=+0.094135065 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 14:26:51 np0005535656 podman[220684]: 2025-11-25 19:26:51.004845023 +0000 UTC m=+0.115040668 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 25 14:26:54 np0005535656 ovn_controller[95460]: 2025-11-25T19:26:54Z|00214|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 25 14:26:55 np0005535656 nova_compute[187219]: 2025-11-25 19:26:55.214 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:55 np0005535656 nova_compute[187219]: 2025-11-25 19:26:55.546 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:26:55 np0005535656 podman[220729]: 2025-11-25 19:26:55.942844398 +0000 UTC m=+0.066873562 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 14:26:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:26:59.105 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:26:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:26:59.105 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:26:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:26:59.106 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:27:00 np0005535656 nova_compute[187219]: 2025-11-25 19:27:00.218 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:00 np0005535656 nova_compute[187219]: 2025-11-25 19:27:00.548 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:01 np0005535656 podman[220752]: 2025-11-25 19:27:01.949022993 +0000 UTC m=+0.071957988 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd)
Nov 25 14:27:05 np0005535656 nova_compute[187219]: 2025-11-25 19:27:05.221 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:05 np0005535656 nova_compute[187219]: 2025-11-25 19:27:05.549 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:05 np0005535656 podman[197580]: time="2025-11-25T19:27:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:27:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:27:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:27:05 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:27:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Nov 25 14:27:10 np0005535656 nova_compute[187219]: 2025-11-25 19:27:10.223 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:10 np0005535656 nova_compute[187219]: 2025-11-25 19:27:10.552 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:11 np0005535656 podman[220774]: 2025-11-25 19:27:11.932276199 +0000 UTC m=+0.051547328 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:27:15 np0005535656 nova_compute[187219]: 2025-11-25 19:27:15.226 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:15 np0005535656 nova_compute[187219]: 2025-11-25 19:27:15.553 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:16 np0005535656 nova_compute[187219]: 2025-11-25 19:27:16.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:27:17 np0005535656 nova_compute[187219]: 2025-11-25 19:27:17.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:27:18 np0005535656 nova_compute[187219]: 2025-11-25 19:27:18.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:27:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:27:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:27:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:27:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:27:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:27:19 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:27:19 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:27:19 np0005535656 nova_compute[187219]: 2025-11-25 19:27:19.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:27:19 np0005535656 nova_compute[187219]: 2025-11-25 19:27:19.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 25 14:27:19 np0005535656 nova_compute[187219]: 2025-11-25 19:27:19.673 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 25 14:27:19 np0005535656 nova_compute[187219]: 2025-11-25 19:27:19.948 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 25 14:27:20 np0005535656 nova_compute[187219]: 2025-11-25 19:27:20.229 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:20 np0005535656 nova_compute[187219]: 2025-11-25 19:27:20.556 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:21 np0005535656 nova_compute[187219]: 2025-11-25 19:27:21.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:27:21 np0005535656 nova_compute[187219]: 2025-11-25 19:27:21.850 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:27:21 np0005535656 nova_compute[187219]: 2025-11-25 19:27:21.851 187223 DEBUG nova.compute.manager [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 25 14:27:21 np0005535656 podman[220800]: 2025-11-25 19:27:21.982186441 +0000 UTC m=+0.082058280 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 25 14:27:22 np0005535656 podman[220799]: 2025-11-25 19:27:22.023507684 +0000 UTC m=+0.129503758 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 25 14:27:23 np0005535656 nova_compute[187219]: 2025-11-25 19:27:23.673 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:27:24 np0005535656 nova_compute[187219]: 2025-11-25 19:27:24.672 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.054 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.055 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.055 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.056 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.286 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.354 187223 WARNING nova.virt.libvirt.driver [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.355 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5874MB free_disk=73.15941619873047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.356 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.356 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 25 14:27:25 np0005535656 nova_compute[187219]: 2025-11-25 19:27:25.558 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 25 14:27:26 np0005535656 nova_compute[187219]: 2025-11-25 19:27:26.072 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 25 14:27:26 np0005535656 nova_compute[187219]: 2025-11-25 19:27:26.073 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 25 14:27:26 np0005535656 nova_compute[187219]: 2025-11-25 19:27:26.223 187223 DEBUG nova.compute.provider_tree [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed in ProviderTree for provider: 752b63a7-2ce2-4d83-a281-12c9803714ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 25 14:27:26 np0005535656 nova_compute[187219]: 2025-11-25 19:27:26.823 187223 DEBUG nova.scheduler.client.report [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Inventory has not changed for provider 752b63a7-2ce2-4d83-a281-12c9803714ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 25 14:27:26 np0005535656 nova_compute[187219]: 2025-11-25 19:27:26.826 187223 DEBUG nova.compute.resource_tracker [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 25 14:27:26 np0005535656 nova_compute[187219]: 2025-11-25 19:27:26.826 187223 DEBUG oslo_concurrency.lockutils [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 25 14:27:26 np0005535656 podman[220842]: 2025-11-25 19:27:26.950232214 +0000 UTC m=+0.068256168 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, version=9.6, maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Nov 25 14:27:28 np0005535656 nova_compute[187219]: 2025-11-25 19:27:28.822 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:27:28 np0005535656 nova_compute[187219]: 2025-11-25 19:27:28.822 187223 DEBUG oslo_service.periodic_task [None req-f3bb39de-d0b2-4da3-a875-f0f72485319f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 25 14:27:30 np0005535656 nova_compute[187219]: 2025-11-25 19:27:30.289 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:30 np0005535656 nova_compute[187219]: 2025-11-25 19:27:30.568 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:32 np0005535656 podman[220865]: 2025-11-25 19:27:32.926410331 +0000 UTC m=+0.054080947 container health_status 1f1e331a0778d38de074020dc88d6525b3f6177ff1a55fe8123a491fca889c60 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 25 14:27:35 np0005535656 nova_compute[187219]: 2025-11-25 19:27:35.291 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:35 np0005535656 nova_compute[187219]: 2025-11-25 19:27:35.607 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:35 np0005535656 podman[197580]: time="2025-11-25T19:27:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 14:27:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:27:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Nov 25 14:27:35 np0005535656 podman[197580]: @ - - [25/Nov/2025:19:27:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 25 14:27:39 np0005535656 systemd-logind[788]: New session 48 of user zuul.
Nov 25 14:27:39 np0005535656 systemd[1]: Started Session 48 of User zuul.
Nov 25 14:27:40 np0005535656 nova_compute[187219]: 2025-11-25 19:27:40.294 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:40 np0005535656 nova_compute[187219]: 2025-11-25 19:27:40.608 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:42 np0005535656 podman[221027]: 2025-11-25 19:27:42.684049653 +0000 UTC m=+0.090705223 container health_status 7c3f1f08c320312811762b91638e0e392637fc9db0c9256e8da4575247433763 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 14:27:44 np0005535656 ovs-vsctl[221083]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 25 14:27:45 np0005535656 nova_compute[187219]: 2025-11-25 19:27:45.296 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:45 np0005535656 nova_compute[187219]: 2025-11-25 19:27:45.651 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:45 np0005535656 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 220913 (sos)
Nov 25 14:27:45 np0005535656 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 25 14:27:45 np0005535656 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 25 14:27:46 np0005535656 virtqemud[186765]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 25 14:27:46 np0005535656 virtqemud[186765]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 25 14:27:46 np0005535656 virtqemud[186765]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 25 14:27:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:27:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 14:27:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 14:27:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 14:27:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:27:49 np0005535656 openstack_network_exporter[199738]: ERROR   19:27:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 14:27:49 np0005535656 openstack_network_exporter[199738]: 
Nov 25 14:27:49 np0005535656 systemd[1]: Starting Hostname Service...
Nov 25 14:27:49 np0005535656 systemd[1]: Started Hostname Service.
Nov 25 14:27:50 np0005535656 nova_compute[187219]: 2025-11-25 19:27:50.298 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:50 np0005535656 nova_compute[187219]: 2025-11-25 19:27:50.704 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:52 np0005535656 podman[221760]: 2025-11-25 19:27:52.974728287 +0000 UTC m=+0.076401148 container health_status e54d07521f1640c8941e915c72a7277cf6d4a2a719a3b49d6665099fb5b85e3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 14:27:52 np0005535656 podman[221759]: 2025-11-25 19:27:52.99563548 +0000 UTC m=+0.115836769 container health_status b25190214ab48e72034992e2b624b9896f7a7f8bd38c1ece4d7b34aa9ff7fbff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 14:27:55 np0005535656 nova_compute[187219]: 2025-11-25 19:27:55.339 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:55 np0005535656 nova_compute[187219]: 2025-11-25 19:27:55.706 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:27:57 np0005535656 podman[221805]: 2025-11-25 19:27:57.952197893 +0000 UTC m=+0.074291061 container health_status 259ccb1bd669632f6ed5ae6d300bbafccb4c1c1c4d4f4f8a3e441a530c3d05e7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Nov 25 14:27:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:27:59.106 104346 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 25 14:27:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:27:59.106 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 25 14:27:59 np0005535656 ovn_metadata_agent[104341]: 2025-11-25 19:27:59.106 104346 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 25 14:28:00 np0005535656 nova_compute[187219]: 2025-11-25 19:28:00.342 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:28:00 np0005535656 nova_compute[187219]: 2025-11-25 19:28:00.706 187223 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 25 14:28:01 np0005535656 nova_compute[187219]: 2025-11-25 19:28:01.544 187223 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.36 sec
